>not like this amdtards


  1. 1 month ago
    Anonymous

    >OpenGL
    into the trash it goes

    • 1 month ago
      Anonymous

      Vulkan didn't exist when Minekampf came out.

      • 1 month ago
        Anonymous

        Then add it, it's only one of the most popular games ever backed by one of the biggest tech companies in the world

        • 1 month ago
          Anonymous

          The company in question hates you though. The fact that you're playing the moddable DRM-free Java/OpenGL version instead of the unmoddable restricted C++/D3D one is an insult to them. Why would they add anything you want?

        • 1 month ago
          Anonymous

          >then add it
          It is not that easy. It's not just that Minecraft uses OpenGL, it uses an old version of it. Migrating from OpenGL 4.2 to Vulkan is fairly "direct" imo: there's more boilerplate, but you gain finer control over the command buffer (and thus more parallelism, which was already the predominant theme of OpenGL 4 in the first place). But from OpenGL 3.2 (the version current Minecraft targets, as far as i know) there's so much that would have to change that you'd pretty much rewrite the engine from the ground up. Not feasible, they would have to dedicate an ENTIRE YEAR with the whole dev team just to rewrite that shit.

          That's why i hate those "use vulkan" comments. They don't actually know what they're asking for and how painful it is to just "use vulkan". When you use OpenGL or DirectX, you're just calling a graphics API provided by the OS and the driver; when you use Vulkan, you're almost writing an entire graphics DRIVER yourself.

          Captcha: HANGM.

          • 1 month ago
            Anonymous

            Right, it's not feasible for such a small indie company, they don't even have enough resources to add all the new content they want, that's why they have to decide with polls

          • 1 month ago
            Anonymous

            Well, if microsoft wants to kill minecraft with no new updates for an entire year, maybe they have the resources to do it. I'm from the time when Notch still owned Mojang, i've waited through longer periods without an update and i prefer it that way, everyone gets time to get on the same page, etc. But zoomers today are too addicted to dopamine for that, and there are too few of us left who would endure such a wait.

          • 1 month ago
            Anonymous

            >muh zoomers
            >he doesn't know about the update poll incident

          • 1 month ago
            Anonymous

            it is THAT easy, moron. there's a handful of vulkan renderers for minecraft done in free time by some suicidal troons, let alone a 4 trillion dollar company

        • 1 month ago
          Anonymous

          lol why do amdrones blame literally everyone but AMD? get banned for tampering with Vidya to cope with shit tech? blame valve. Your cpu fries because AMD didn't have any safety checks in its bios, or in the silicon? Uhh wtf Asus!! No one cares about or supports amd tech in AI? Nvidia is behind this.

          • 1 month ago
            Anonymous

            I agree with everything but the bios. There isn't a single "safety" check on any motherboard. If you frick up the bios on any motherboard in the right way you can and will kill the CPU. The VRM is just doing its job. It does it on everything from Motherboards to GPUs. There is no "safety" on any of them.

          • 1 month ago
            Anonymous

            no, the AGESA firmware was literally made by AMD. Asus followed amd specs and used the AGESA binaries AMD provided, but the firmware would just let the cpu eat moronic voltages

          • 1 month ago
            Anonymous

            >Asus followed amd specs and used the AGESA binaries AMD provided, but the firmware would just let the cpu eat moronic voltages
            Clearly Asus had some say in what voltage the CPU got, because the failure modes of each brand were different. Asus killed both the CPU and board while gigabyte mostly just killed the CPU.

            It was also discovered that Asus had a mechanism to deal with aging, which is why Asus boards caused so much damage. If the CPU wasn't responsive on POST, the firmware continuously increased the SoC voltage in an attempt to get it to boot and ended up pumping hundreds of amps into the CPU. CPU overcurrent protection should have worked, but Asus boards had that functionality broken.

            I'm not one to pin absolutely zero blame on board vendors. This was an issue in the first place to placate consumer demand for faster memory and better memory compatibility; they had the option to just say no and point to OFFICIAL AMD specs (and not just engineer hearsay about muh sweet spot) for memory speeds, but no, they decided edging on high voltages was completely ok.

            AMD does deserve some blame for apparently not being explicit with vendors about what "safe" voltages are and for not making sure boards actually protected the CPU, but then there is a discussion to be had: how much do you really want AMD to control your precious board vendors?

          • 1 month ago
            Anonymous

            okay, that's actually a genuinely good point. I do wonder why this doesn't happen more often, then. I was under the naive impression that protections were also on-chip (before this unfolded)

          • 1 month ago
            Anonymous

            ? I have Nvidia

        • 1 month ago
          Anonymous

          Oh no, what will amdtards do without this Minecraft mod that is the only thing in the world using mesh shaders and probably isn't even compatible with more useful visual mods like distant horizons and shaders.

          t. 3080 user

          It takes them years and a huge team of people to add a single mob or biome. They will never replace the entire graphical backend.

          • 1 month ago
            Anonymous

            >It takes them years and a huge team of people to add a single mob or biome.
            Kek meanwhile a decent modder does that in a week, what are they even doing at Mojang?

          • 1 month ago
            Anonymous

            I'm like 90% sure the answer is that they've transitioned to a fully corporate development model, which means that devs get to estimate 4 weeks for simple changes, it gets scheduled into the sprint, and then they browse reddit/hackernews for three weeks and four days before sitting down and getting the change done. Then it takes another 2 weeks of ping-ponging to pass code review, during which time the dev will keep watching youtube or browsing stackoverflow rather than bothering to pick up the next ticket.
            And before any of that can even happen, every new mob has to be broken down by the "product owner" into a couple dozen tickets (each of which gets the workflow described above), and of course the design team has to approve and sign off on it and its functionality to ensure "user accessibility" and conformance with the "brand" and official standards. (The latter is at least somewhat understandable: if a modder adds weird jank it's whatever, but Mojang will always be more conservative with its official additions.)

    • 1 month ago
      Anonymous

      just like amd

      >impotent nvidia cuck rage
      I don't think about you at all.

      I have an amd card.

      Just wait (TM)
      https://community.amd.com/t5/opengl-vulkan/is-mesh-shader-support-planned-for-opengl/m-p/650469

      two more weeks

      • 1 month ago
        Anonymous

        >just like amd
        based
        in 3 years even intel will blow AMD gpus out of the water

        • 1 month ago
          Anonymous

          >in 3 years even intel will blow AMD gpus out of the water
          unironically by the next gen or two

          • 1 month ago
            Anonymous

            I don't know about next gen. I'm running an Arc A750 and while the software improvements have been massive, it's still in like the 3050 - 3060 range. At least the software is good, when it's supported. Frick AMD's compute ecosystem, what a clusterfrick

          • 1 month ago
            Anonymous

            I'm curious, what motivates you to buy an experimental, in-development GPU where you have to hope for "software improvements down the line, maybe"?
            Especially from the likes of intel. I could understand if it was some kind of brand new underdog manufacturer and you bought it as just a tech enthusiast to be an early adopter and follow along how new tech evolves, and in part also to support the new manufacturer and help fund improved future versions.
            But Intel is the opposite of a small underdog brand new tech and definitely does not need any of your sales to "help fund" further development.

          • 1 month ago
            Anonymous

            nta but just funding amd's and nvidia's competitor

          • 1 month ago
            Anonymous

            How

          • 1 month ago
            Anonymous

            Intel is the small underdog brand new tech now.

          • 1 month ago
            Anonymous

            Because NVidia has gone delusional with pricing and AMD is happy to sit around in second place while enjoying the "new normal" prices.
            Both of them need a kick in the ass, buying Intel is a statement.

          • 1 month ago
            Anonymous

            >Both of them need a kick in the ass, buying Intel is a statement.
            Based.

          • 1 month ago
            Anonymous

            >As if there is only monopoly or duopoly and all hell breaks loose when you introduce a third company and for some reason the third company can't price-fix with the other two.

            Nothing will change. Prices will go up and Nvidia, AMD and Intel negotiate all the prices and shit behind closed doors.

          • 1 month ago
            Anonymous

            I needed a new GPU
            I am using Linux
            I wanted to mess around with PyTorch etc.
            I didn't want AMD GPU, since AMD Compute is a joke
            Intel GPU was cheap at the time.
            It was already a few months after the Arc release so I knew what to expect

            I've been pretty happy with my Arc GPU. It does what I expect it to do and Intel is actually improving their software.

            When I bought the Arc GPU, you could crash the AMD driver on linux just by running a memory bandwidth test. Lmfao
            https://github.com/ROCm/rocm_bandwidth_test
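
            (For reference, a minimal sketch of checking which GPU backend a PyTorch install actually picked up, assuming a reasonably recent PyTorch build; which backend modules exist depends on whether you installed the CUDA, ROCm, or XPU build:)

              import torch

              # ROCm builds of PyTorch reuse the torch.cuda API, so is_available() is
              # True on AMD cards too; torch.version.hip is only set on ROCm builds.
              if torch.cuda.is_available():
                  backend = "ROCm/HIP" if getattr(torch.version, "hip", None) else "CUDA"
                  print(backend, torch.cuda.get_device_name(0))
              # Intel Arc: newer PyTorch builds ship a torch.xpu module (older setups
              # needed intel_extension_for_pytorch); hasattr guards builds without it.
              elif hasattr(torch, "xpu") and torch.xpu.is_available():
                  print("XPU", torch.xpu.get_device_name(0))
              else:
                  print("CPU only, no supported GPU backend found")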

          • 1 month ago
            Anonymous

            Interesting, so their compute is better than AMD's?
            >cheap
            >linux works
            >compute actually works
            That's actually a very interesting usecase, fair enough

          • 1 month ago
            Anonymous

            >fully featured hardware and software is a use case now for freetards
            fricking untermenschen

          • 1 month ago
            Anonymous

            AMD's compute is a complete joke, not hard to be better than that

          • 1 month ago
            Anonymous

            >Interesting, so their compute is better than AMD's?
            Yes. It's hard to overstate how bad AMD's compute stack is for the average consumer. It's probably good enough when you are a big player with direct contacts to AMD engineers, but as an average consumer you're fricked.
            [...]
            this.

            I honestly thought the main problem was the CUDA monopoly; if AMD is really that bad then it makes a lot more sense

          • 1 month ago
            Anonymous

            Nah it's everything, ROCm is badly supported and works terribly even when "supported"
            AMD GPUs are only good for playing modern-ish video games (older DirectX versions run like shit), anything else they're completely useless

          • 1 month ago
            Anonymous

            skill issue. again. you run opencl "headless". no rocm this way
            its an issue to begin with only bc youre a consoomer poorgay + goyim arent allowed into compute

            which is great bc smart anons find workarounds.
            rich anons dont give a frick
            and the only ones who are affected are the /lmg/ rabble who pay through the arse for ngreedia stuff.

            everything is as should be.
            ignorance should be painful.

          • 1 month ago
            Anonymous

            skill issue.
            https://www.tomshardware.com/news/amd-powered-frontier-supercomputer-breaks-the-exascale-barrier-now-fastest-in-the-world

            people who talk about "nuh aymd compute" cant do compute for shit.
            theyre the pedos of this board who are amgery they cant run their cp-generators on ayyymd HW

            cuz they cant do compute for shit. all theyre able to do is to follow a tutorial on how to download a git.
            thus their opinions are void and should be automatically discarded

            very based. I will keep buying AMD. AMD software might not work on release and ROCm support might be added months after hardware release, but this is why AMD is fine wine (TM)

          • 1 month ago
            Anonymous

            >buying early access hardware in the hope it gets better later
            that's not fine wine lmao

          • 1 month ago
            Anonymous

            full disclosure: i buy amd cuz its cheap for the performance it offers.
            cant wait to lay my hands on the newest generation with that juicy, plump looking "cache", and to see for myself what its actually worth.

            im a poorgay myself, its just that i got the brains to deal with amd's lack of interest in the poorgay market
            (compared to ngreedia who has NO poorgay market)
            (things of life (TM))

          • 1 month ago
            Anonymous

            >>im a poorgay myself
            at least you are honest moron

          • 1 month ago
            Anonymous

            >at least you are honest moron
            yes i am
            why are you seething about that? wtf is wrong with you?

          • 1 month ago
            Anonymous

            (cont.)
            if you wanna have a look at AMD's premiums, radeon VII can still be bought new, and it goes for 2x the price it was launched at.

            now thats fine wine.

          • 1 month ago
            Anonymous

            I bought a VII day one, still have it to this day. Had to flash it to even make it work with current systems. A small roadblock but still, you have to question what is going on over there on team red.

          • 1 month ago
            Anonymous

            Thanks for the input, ebussy

          • 1 month ago
            Anonymous

            go back to your designated child porn thread
            you dont have the brains to take part in this discussion

          • 1 month ago
            Anonymous

            Don't pop a vein, ebussy, there's no usecase for that
            Captcha G2AAMD

          • 1 month ago
            Anonymous

            >still seething
            look
            the whole thread knows that
            1. youre a pedo
            2. youre seething for some fricking reason (prolly bc of point 1)
            3. youre not very smart.

            why do you keep doing this to yourself?

          • 1 month ago
            Anonymous

            What's the usecase for that projection, ebussy?

          • 1 month ago
            Anonymous

            learn english stupid fricking pedo

            >nvidia doesnt have a cache architecture. at all.
            because they don't need it
            >and they have way less of it.
            >implying it's innovation to solder more memory chips to the pcb
            >whether you like it or not, amd does innovate even in gpus, and their innovation is actually smart.
            name three innovations they came up with themselves that are also widely used and not just memes
            >nvidia's approach is just to bruteforce everything. but thats not efficient.
            nah that's amds job with cranking up those frequencies

            thats cope

            >nvidia's approach is just to bruteforce everything. but thats not efficient.
            You mean how AMD bruteforces raytracing and matrix operations with shader cores, while Nvidia has hardware acceleration for that?

            fair point. kindof.
            because amd ends up with superior generalist cores.
            but then thats not the assignment with gpus. gpus are made to process graphics... or are they?
            something something markets and engineering considerations...
            in short were as much both right, and both wrong. it depends on your point of view on what a gpu is meant to be.

            >nvidia's approach is just to bruteforce everything. but thats not efficient.
            so AMD slapping more memory (cache) on gpu isn't bruteforcing?

            no, not at all.
            cache is local memory, but managed.
            also its shared (to a bigger degree) between your compute cores.
            the "wiring" is much more complex than what you would find in an nvidia consumer gpu.

          • 1 month ago
            Anonymous

            >all this and still can't touch the 4090

          • 1 month ago
            Anonymous

            no, its the opposite.
            in raw performance the 7900 XTX beats the 4090
            like i wrote, your gaymershit has no meaning whatsoever in big boi applications.

          • 1 month ago
            Anonymous

            >still worse in ray tracing
            >still less memory
            >still worse at video rendering
            >still worse at trans coding
            >still worse in blender
            >still worse in 3ds max
            show me the "raw" performance

          • 1 month ago
            Anonymous

            https://www.techpowerup.com/gpu-specs/geforce-rtx-4090.c3889
            https://www.techpowerup.com/gpu-specs/radeon-rx-7900-xtx.c3941

            its under the rubric "theoretical performance"

            aight gtg, IQfyermins.
            ill be back in an hour or so
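
            (Those "theoretical performance" numbers on spec pages are basically just ALU count x clock x FLOPs per ALU per clock. A minimal sketch of the arithmetic; the 6144 ALUs and 2.5 GHz below are illustrative assumptions, not any specific card's real spec:)

              def peak_fp32_tflops(shader_alus, clock_ghz, flops_per_alu_per_clock=2):
                  # 2 FLOPs per ALU per clock assumes one FMA issued every cycle;
                  # architectures that dual-issue FP32 double this factor.
                  return shader_alus * flops_per_alu_per_clock * clock_ghz / 1000.0

              # illustrative only: a hypothetical 6144-ALU GPU boosting to 2.5 GHz
              print(peak_fp32_tflops(6144, 2.5))  # ~30.7 "theoretical" TFLOPS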

          • 1 month ago
            Anonymous

            theory and practice are two different things neet

          • 1 month ago
            Anonymous

            >doesnt understand basic technical terminology
            i think you really should go back
            aight
            cya in an hour

          • 1 month ago
            Anonymous

            They use their own reviews as a basis, and for a long time those used a mere 5800X as the test CPU.
            That 5800X is probably bottlenecking the 4090.

          • 1 month ago
            Anonymous

            >trans coding
            kek
            btw imagine transcoding with a fricking hw encoder. yes nvidia's nvenc is better, but like, it's still a piece of shit compared to any software encoder and there's no real reason to use it for transcoding, all it's good at is game recording and streaming

          • 1 month ago
            Anonymous

            What big boi applications? Literally everything that is important uses CUDA.

          • 1 month ago
            Anonymous

            >learn english
            >were as much both right, and both wrong
            >stupid fricking pedo
            >projection continues

          • 1 month ago
            Anonymous

            These posts just make me want to sell my 7900 XTX for an nvidia card

          • 1 month ago
            Anonymous

            please do.
            it is the reason i can get cheap shit

            BIG BOY 590
            BIG BOY 590
            BIG BOY 590
            [...]
            cope cope cope

            yeah. big boy 590.
            my latest stat model was ~110 MB.
            if you need more, thats skill issue

          • 1 month ago
            Anonymous

            (cont.)
            im back btw
            feel free to open ass itt, IQfyermin

          • 1 month ago
            Anonymous

            skill issue.
            https://www.tomshardware.com/news/amd-powered-frontier-supercomputer-breaks-the-exascale-barrier-now-fastest-in-the-world

            people who talk about "nuh aymd compute" cant do compute for shit.
            theyre the pedos of this board who are amgery they cant run their cp-generators on ayyymd HW

            cuz they cant do compute for shit. all theyre able to do is to follow a tutorial on how to download a git.
            thus their opinions are void and should be automatically discarded

          • 1 month ago
            Anonymous

            >He spends his own time to fix mistakes made by a company that is worth 200B
            not everyone is a NEET like you anon

          • 1 month ago
            Anonymous

            im not a neet tho
            i got education, experience, and training.

            I bought a VII day one, still have it to this day. Had to flash it to even make it work with current systems. A small roadblock but still, you have to question what is going on over there on team red.

            yeah, they piss on the pro-sumer market.
            you are expected to have the knowledge to deal with a lack of support when you do business with amd.
            or you buy their instinct line of products, and then they send a barely legal girl to deliver you the component and give you head as a bonus

            when i was snooping for solutions to my problems i saw an exchange that implied their engineers allocated to compute on (pro) consumer tier ARE THE SAME PEOPLE WHO THEN DO CUSTOMER SUPPORT ON THEIR FORUMS

            in my mind, thats olympic levels of money-pinching

          • 1 month ago
            Anonymous

            despite all of that, you still buy AMD products? interesting

          • 1 month ago
            Anonymous

            yeah
            they have a smart architecture. i get insane performance out of amd's boards.
            to get the equivalent with ngreedia i would have to pay several times more... bc then im paying for all the bells and whistles which i dont fundamentally need.

            in practice the only thing you need rocm/cuda framework for is to profile your code.
            but if you can write openCL/CUDA, writing benchmarks is trivial.

            with ngreedia youre literally paying premium for a couple of tools you can write yourself
            and idk about you but burning the difference on drugs and prostitutes is a better investment to me than spending it on what amounts to a couple hundred lines of code.

            finally, the performance difference between nvidia and amd is really not that big, and sometimes goes in the direction opposite to what consumer-space has been taught to think.

            mind that in the compute world, "gayme benchmarks" are the result of bad coding.
            when you do big boy stuff on your gpu, its the "theoretical performance" that interests you.
            and lo and behold!
            rx 7900 XTX is actually better for compute than an rtx 4090.
            for half the price. but youre gonna have to code smarter, and youre prolly gonna have to do some troubleshooting.
            idk about you, but i always code smart, and for 1000 bucks im willing to do some troubleshooting
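
            (On the "writing benchmarks is trivial" point above, a minimal sketch of what that looks like with pyopencl, assuming the pyopencl and numpy packages and a working OpenCL runtime; the kernel and sizes are purely illustrative:)

              import numpy as np
              import pyopencl as cl

              ctx = cl.create_some_context()
              queue = cl.CommandQueue(ctx, properties=cl.command_queue_properties.PROFILING_ENABLE)

              n = 1 << 24
              a = np.random.rand(n).astype(np.float32)
              b = np.random.rand(n).astype(np.float32)

              mf = cl.mem_flags
              a_buf = cl.Buffer(ctx, mf.READ_ONLY | mf.COPY_HOST_PTR, hostbuf=a)
              b_buf = cl.Buffer(ctx, mf.READ_ONLY | mf.COPY_HOST_PTR, hostbuf=b)
              out_buf = cl.Buffer(ctx, mf.WRITE_ONLY, a.nbytes)

              prg = cl.Program(ctx, """
              __kernel void add(__global const float *a, __global const float *b, __global float *out) {
                  int i = get_global_id(0);
                  out[i] = a[i] + b[i];
              }
              """).build()

              evt = prg.add(queue, (n,), None, a_buf, b_buf, out_buf)  # kernel launch returns an event
              evt.wait()
              ns = evt.profile.end - evt.profile.start  # device-side time, in nanoseconds
              print("kernel time: %.3f ms" % (ns * 1e-6))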

          • 1 month ago
            Anonymous

            oh cool. can you show some interesting projects you've made?

          • 1 month ago
            Anonymous

            how? i wont upload my code, im not a freetard.
            and all you will see are numbers and letters in a console prompt with my explanation as the sole indication of what is going on.

          • 1 month ago
            Anonymous

            fake and gay

          • 1 month ago
            Anonymous

            cope

          • 1 month ago
            Anonymous

            >rx 590
            Fricking kek

          • 1 month ago
            Anonymous

            >200$ gpu released in 2017 that's still perfectly usable today
            >kek
            nvidiots stay mad, how is your 3 gb 1060 doing?

          • 1 month ago
            Anonymous

            BIG BOY BIG BOY BIG BOY BIG BOY

          • 1 month ago
            Anonymous

            How many 4090s have you been gapping with your 590?

          • 1 month ago
            Anonymous

            I have a 1080 ti but whatever helps you cope at night

          • 1 month ago
            Anonymous

            >secure boot disabled
            ???

          • 1 month ago
            Anonymous

            BIG BOY 590
            BIG BOY 590
            BIG BOY 590

            have a nice day you third world Black person your country should be fricking glassed.

            cope cope cope

          • 1 month ago
            Anonymous

            >muh amd beats nvidia
            >i can prove it
            >doesn't even have the amd gpu he's shilling

          • 1 month ago
            Anonymous

            That's compute focused CDNA, not mainstream RDNA. Try again.
            AMD does not support any mainstream GPU for compute under Linux other than RX 7900 XT/XTX and Radeon VII of all things - https://rocm.docs.amd.com/projects/install-on-linux/en/latest/reference/system-requirements.html
            Meanwhile NOVIDEO supports literally every GPU with CUDA.

          • 1 month ago
            Anonymous

            They're dropping Vega (which includes Vega II) lmao

          • 1 month ago
            Anonymous

            Radeon VII bros... How come Lisa don't want us, man?

          • 1 month ago
            Anonymous

            dont. just dont.
            im a polarisbro.
            we have seen it coming for a long fricking while
            mama su want fricking money. she has big breasts but there is no milk for us.

          • 1 month ago
            Anonymous

            >AMD-Powered Exascale Supercomputer Has a System Failure Every Few Hours

            >Currently, some of the problems are apparently related to the AMD Instinct GPU accelerators. “The issues span lots of different categories, the GPUs are just one,” said Whitt. He said the trouble is pretty evenly spread out amongst Frontier's various hardware. Also, the issues apparently arise when the computer is executing extremely demanding workloads, according to the report. Whitt says running a benchmark is a different can of worms compared with running scientific applications.

            https://www.extremetech.com/extreme/340136-amd-powered-exascale-supercomputer-has-a-system-failure-every-few-hours

          • 1 month ago
            Anonymous

            HAHAHAHAHAHHAHAHAHAHAHAHAHAHAHHAHAHAHAHAHAHHAHAHAHAHAHAHAHHAHAHAHAHAHAHAHHAHAHAHAHAHAH
            even they got ayyymd'ed

          • 1 month ago
            Anonymous

            >supercomputer is a memey mess of broken shit
            I mean, is it surprising?

          • 1 month ago
            Anonymous

            >Interesting, so their compute is better than AMD's?
            Yes. It's hard to overstate how bad AMD's compute stack is for the average consumer. It's probably good enough when you are a big player with direct contacts to AMD engineers, but as an average consumer you're fricked.

            AMD's compute is a complete joke, not hard to be better than that

            this.

          • 1 month ago
            Anonymous
          • 1 month ago
            Anonymous

            >I wanted to mess around with PyTorch etc.
            when you're not gayming and you need a GPU CUDA really is a dealbreaker

          • 1 month ago
            Anonymous

            >PyTorch etc.
            >I didn't want AMD GPU, since AMD Compute is a joke
            Understatement of the century. My RX5700 isn't long in the tooth by any means for the kind of gaming I do.
            >want to try AIshit
            >Lisa says "lol frick you"

          • 1 month ago
            Anonymous

            >RX5700
            >want to try AIshit
            >Lisa says "lol frick you"
            with a subtle undertone of "actually if you recompile literally everything yourself it will work but stable diffusion will still be slow"

          • 1 month ago
            Anonymous

            True but it's getting better. nVidia is pissing so many people off there's a great impetus to getting all the popular AI libraries working on anything that's not nVidia.

          • 1 month ago
            Anonymous

            >Intel is the opposite of a small underdog brand new tech

          • 1 month ago
            Anonymous

            Now put NVIDIA next to them.

          • 1 month ago
            Anonymous

      In reality, outside of marketbrained mutts, amd is not even a competitor to intel, with intel owning their own facilities and being so important to the big glowBlack person that they are a call away from a 200 million dollar donation from joe bida

          • 1 month ago
            Anonymous

            What is this comparison?

          • 1 month ago
            Anonymous

            this
            intel lost.
            amdchads keep winning
            amd was created literally to break intel monopoly

          • 1 month ago
            Anonymous

            too much winning is bad tho
            R7000 delaminating silicon got swept under the rug apparently
            and it still runs at high temps by default.
            planned obsolescence much? (my bulldozer is still running)

          • 1 month ago
            Anonymous

            >I'm curious, what motivates you to buy an experimental, in-development GPU where you have to hope for "software improvements down the line, maybe"?

            Ask people who have been buying AMD

        • 1 month ago
          Anonymous

          >in 3 years even intel will blow AMD gpus out of the water
          Nah AMD is forced to innovate, they already have the contract for next gen consoles.

          • 1 month ago
            Anonymous

            they had those since last gen and it didn't help their desktop gpus at all

          • 1 month ago
            Anonymous

            >they already have the contract for next gen consoles.
            because they're the only company willing to bend over backwards to create custom graphics solutions
            >AMD is forced to innovate
            you mean play catchup like they always do (in graphics)

          • 1 month ago
            Anonymous

            they still play catchup on both sides when it comes to idle power consumption and features

          • 1 month ago
            Anonymous

            they still play catchup on both sides when it comes to idle power consumption and features

            >catch up
            idk man
            nvidia doesnt have a cache architecture. at all.

            they have local memory, which is not the same.
            and they have way less of it.
            whether you like it or not, amd does innovate even in gpus, and their innovation is actually smart.
            nvidia's approach is just to bruteforce everything. but thats not efficient.

          • 1 month ago
            Anonymous

            >nvidia's approach is just to bruteforce everything. but thats not efficient.
            so AMD slapping more memory (cache) on gpu isn't bruteforcing?

          • 1 month ago
            Anonymous

            >nvidia doesnt have a cache architecture. at all.
            because they don't need it
            >and they have way less of it.
            >implying it's innovation to solder more memory chips to the pcb
            >whether you like it or not, amd does innovate even in gpus, and their innovation is actually smart.
            name three innovations they came up with themselves that are also widely used and not just memes
            >nvidia's approach is just to bruteforce everything. but thats not efficient.
            nah that's amds job with cranking up those frequencies

          • 1 month ago
            Anonymous

            >nvidia's approach is just to bruteforce everything. but thats not efficient.
            You mean how AMD bruteforces raytracing and matrix operations with shader cores, while Nvidia has hardware acceleration for that?

        • 1 month ago
          Anonymous

          Intel GPUs cant play old games, which removes one of the biggest advantages of PC gaming. No such problem on Nvidia and AMD.

          • 1 month ago
            Anonymous

            AMD is only slightly better than Intel when it comes to old games, they tend to perform like shit in anything that isn't Vulkan or DX12

          • 1 month ago
            Anonymous

            Why is that? Shouldn't dxvk make it work fine?

    • 1 month ago
      Anonymous

      Then add it, it's only one of the most popular games ever backed by one of the biggest tech companies in the world

      I just use Zink. Works on Windows too.

      • 1 month ago
        Anonymous

        Zink is great for Windows AMD users, fixes the OpenGL performance problems.

    • 1 month ago
      Anonymous

      >into the trash it goes
      it was already in there like everything software related

  2. 1 month ago
    Anonymous

    >impotent nvidia cuck rage
    I don't think about you at all.

  3. 1 month ago
    Anonymous

    Just wait (TM)
    https://community.amd.com/t5/opengl-vulkan/is-mesh-shader-support-planned-for-opengl/m-p/650469

  4. 1 month ago
    Anonymous

    i run intel so this doesn't matter, amd sucks and is for poor third worlders

    • 1 month ago
      Anonymous

      lets see your lspci. are you intelmaxxed?

  5. 1 month ago
    Anonymous

    Minecraft with sodium and a couple of other mods already runs at 800+ FPS on a fricking RX 580, who gives a shit about getting another 200 FPS by using mesh shaders?

    • 1 month ago
      Anonymous

      you don't know what nvidium is for

      • 1 month ago
        Anonymous

        ???
        Yes, it's for adding even more performance to a game that already runs extremely fast on any GPU (provided you have a good CPU which applies to nvidium as well)

        • 1 month ago
          Anonymous

          no it allows for further rendering of chunks at a smooth frame rate
          it makes compium and opticope look obsolete in comparison
          it outperforms bedrock at 96 chunks by a huge margin

          • 1 month ago
            Anonymous

            >allows for further rendering of chunks at a smooth frame rate
            Chunk loading doesn't drop performance on my RX 580
            >It outperforms bedrock at 96 chunks
            Who the frick needs 96 chunks?

          • 1 month ago
            Anonymous

            >Chunk loading doesn't drop performance on my RX 580
            that's a lie
            >you don't need that
            that's cope now frick off

          • 1 month ago
            Anonymous

            >that's a lie
            Yeah sorry I meant it drops from 800 FPS to 700 when loading chunks, total deal-breaker
            >That's cope now frick off
            oh yeah sorry for coping, everyone needs 96 chunks to play Minecraft, who wouldn't want that?
            Your field of view can barely see past 16 chunks but 96 seems very useful
            I'll have to buy a novidio gpu right now

          • 1 month ago
            Anonymous

            still lying and still coping

          • 1 month ago
            Anonymous

            still not providing any argument

          • 1 month ago
            Anonymous

            >the human eye can only see 16 chunks

          • 1 month ago
            Anonymous

            >coping for 96 chunks
            twittard

          • 1 month ago
            Anonymous

            you get more performance at every rendering distance but especially the higher ones
            why are you mad at free performance?

          • 1 month ago
            Anonymous

            nta but i fricking hate its fog of war and none of the pre-renders fricking work without knowing rocket science

          • 1 month ago
            Anonymous

            >Who the frick needs 96 chunks?
            No one, but to be fair it's crazy how much the render distance limits your vision in Minecraft and you don't even realize it until you try one of those LoD mods.

          • 1 month ago
            Anonymous

            >outperforms bedrock at 96 chunks by a huge margin
            What the frick does this mean? What the frick is the significance of 96 chunks?

          • 1 month ago
            Anonymous

            educate yourself on minecraft sir

          • 1 month ago
            Anonymous

            pi*(96*16)^2 square meters of rendered terrain. That, multiplied by height, should cover ~2838775693 voxels, although I think it's no longer a cylinder but a sphere, so it's probably off
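
            (Roughly that arithmetic, assuming the cylindrical approximation above and a 384-block build height, which is an assumption; older versions are 256:)

              import math

              chunks = 96        # render distance in chunks
              chunk_size = 16    # blocks per chunk edge
              height = 384       # assumed build height (1.18+); older versions are 256

              radius = chunks * chunk_size       # 1536 blocks
              area = math.pi * radius ** 2       # ~7.4 million m^2 of terrain
              volume = area * height             # ~2.85 billion blocks in the render cylinder
              print(f"{area:.3e} m^2, {volume:.3e} blocks")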

    • 1 month ago
      Anonymous

      >sodium
      Why the frick do you morons always go for the most obscure shit? OptiFine and Nvidium are way better and more mature.

      • 1 month ago
        Anonymous

        >obscure
        Lol
        >OptiFine
        Nowhere near as effective nowadays

      • 1 month ago
        Anonymous

        you're just old Black person

      • 1 month ago
        Anonymous

        They aren't using Vulkan for mesh shaders? Darn shame, since VulkanMod is really good, and a generic or Nvidia-specific mesh renderer could probably be added on top of it instead.

        Optifine is nowhere near as good as Sodium nowadays and Nvidium is a patch on top of Sodium.

        • 1 month ago
          Anonymous

          >Optifine is nowhere near as good as Sodium nowadays and Nvidium is a patch on top of Sodium.
          My benchmarks show otherwise.

        • 1 month ago
          Anonymous

          >Nvidium is a patch on top of Sodium.
          No, it replaces the majority of the rendering engine; it just uses Sodium as a base so there's no need to rewrite everything from scratch.

      • 1 month ago
        Anonymous

        Fricking moron. Everybody knows that Optifine is way outdated and it causes multiple fricking crashes everywhere on the board. It's like an autistic Black person was working at a job and then someone makes a slight inconvenience like making some noise; it's going to act genocidal and cause the entire fricking game to crash

  6. 1 month ago
    Anonymous

    >Java
    It's just polishing a turd.

  7. 1 month ago
    Anonymous

    skill issue. literally.
    but what else to expect from a studio that writes their gayme in java.

  8. 1 month ago
    Anonymous

    Nvidium is being made obsolete by the same dev with a similar performance mod that works on any GPU that runs opengl 4.6

    • 1 month ago
      Anonymous

      post it

      • 1 month ago
        Anonymous

        not public yet

        • 1 month ago
          Anonymous

          two more weeks guys!!!!!

          • 1 month ago
            Anonymous

            people have it, just not me

          • 1 month ago
            Anonymous

            >believe me bro

          • 1 month ago
            Anonymous

            I believe him.

          • 1 month ago
            Anonymous

            thats pascal still, 10xx = pascal

    • 1 month ago
      Anonymous

      Sex Reimu

  9. 1 month ago
    Anonymous

    Nobody sane even buys AMD GPUs right now, just not worth it.

    t. doesn't give a crap about the company, just price/performance and feature support

  10. 1 month ago
    Anonymous

    Honestly, I’ve never met an AMD yard who wasn’t an incel.

  11. 1 month ago
    Anonymous

    What exactly is a mesh shader and why can't it be implemented in an AMD GPU?

    • 1 month ago
      Anonymous

      it's a skill issue on amds side

    • 1 month ago
      Anonymous

      there's simply no hardware support

      • 1 month ago
        Anonymous

        >"hardware" mesh shader
        >"hardware" ray tracing
        >"hardware" tensor
        Literally all general purpose compute.

        • 1 month ago
          Anonymous

          do you work for amd?

          cope

          are you beating a 4090 with your 590, mr big boy?

          • 1 month ago
            Anonymous

            >seeing things
            youre mentally ill. or an analphabete

          • 1 month ago
            Anonymous

            Hardware vendor shills should be beaten to death you stupid fricking Black person your thread is moronic and so are you.

          • 1 month ago
            Anonymous

            >gets called out
            >chimps out like the monkey he is
            HAHAHAHAHAHAHHAHAHAHAHAHAHAHAHA

          • 1 month ago
            Anonymous

            How cheap is your labor you pajeet Black person?

          • 1 month ago
            Anonymous

            go back to your big boy applications with your rx 590

          • 1 month ago
            Anonymous

            have a nice day you third world Black person your country should be fricking glassed.

    • 1 month ago
      Anonymous

      AMD does support mesh shaders in hardware, for RDNA-based GPUs at least (2019+).
      However, the only way to use mesh shaders in an OpenGL application at the moment is through the GL_NV_mesh_shader extension.
      AMD's drivers likely don't implement this because their hardware can't (performantly) support everything in the GL_NV_mesh_shader spec. Possibly intentional on Nvidia's part.
      AMD could conceivably create their own mesh shading extension for OpenGL, or push for a vendor-neutral EXT. But OpenGL is not relevant for new games, so they probably don't care to.
      Some more details here on why GL_NV_mesh_shader is weird: https://gitlab.freedesktop.org/mesa/mesa/-/issues/7192
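
      A minimal sketch of checking what a driver actually advertises, assuming the glfw and PyOpenGL Python packages (the extension strings are the real registry names; everything else here is illustrative):

        import glfw
        from OpenGL import GL

        assert glfw.init()
        glfw.window_hint(glfw.VISIBLE, glfw.FALSE)   # hidden window, we only need a GL context
        win = glfw.create_window(64, 64, "ext-check", None, None)
        glfw.make_context_current(win)

        n = GL.glGetIntegerv(GL.GL_NUM_EXTENSIONS)
        exts = {GL.glGetStringi(GL.GL_EXTENSIONS, i).decode() for i in range(n)}

        print("renderer:", GL.glGetString(GL.GL_RENDERER).decode())
        # GL_NV_mesh_shader is currently the only mesh-shader path exposed to OpenGL;
        # Vulkan has the vendor-neutral VK_EXT_mesh_shader instead.
        print("GL_NV_mesh_shader advertised:", "GL_NV_mesh_shader" in exts)

        glfw.terminate()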

      • 1 month ago
        Anonymous

        >they probably don't care to
        How come nvidia cared to

        • 1 month ago
          Anonymous

          because they don't sell early access gpus

        • 1 month ago
          Anonymous

          because nvidia created it?

          • 1 month ago
            Anonymous

            Why would they create something not relevant for new games

          • 1 month ago
            Anonymous

            because it's fairly old at this point and only being used to generate headline buzz + to make sure games run good only on nvidia

  12. 1 month ago
    Anonymous

    This is how these cheap ass companies market their shit now, they use bargain basement Black folk to argue with morons online.

  13. 1 month ago
    Anonymous

    >rx 590 aka rx 580 EXTRA CRISPY edition

  14. 1 month ago
    Anonymous

    no not my heckin incelidium mods for my kiddiecrafty grooming box!!

    • 1 month ago
      Anonymous

      dude broh I gapped this nvidia gpu with my 590 so bad

      • 1 month ago
        Anonymous

        >DOCH

  15. 1 month ago
    Anonymous

    Vendor extensions also exist in Vulkan and Dx12. Not really unique to OpenGL.

  16. 1 month ago
    Anonymous

    Lot of AMD fanboy seething and coping going on in this thread.
    What makes someone so buttmad about a product/company?

    I don't give a shit about Nvidia or AMD, just use the damn GPU if you need it.

    • 1 month ago
      Anonymous

      >What makes someone so buttmad about a product/company?
      they are poor and got memed into buying an amd card

  17. 1 month ago
    Anonymous

    >mesh shaders in minecraft
    Gotta love modders https://www.youtube.com/watch?v=LX3uKHp1Y94

    • 1 month ago
      Anonymous

      >she

      • 1 month ago
        Anonymous

        I don't care about trooners if they shit out 900chunk rendering distance at 300fps in between their dilation
        She's an honorary woman

  18. 1 month ago
    Anonymous

    WHAT DO YOU MEAN NVIDIUM DOES NOT WORK ON AMD

    • 1 month ago
      Anonymous

      IT'S SO FRICKING OVER AMD BIG BOYS

  19. 1 month ago
    Anonymous

    I DON'T WANT YOUR NVIDIAIDS ON MY ALL AMD RIG ANYWAY homosexualS

    • 1 month ago
      Anonymous

      >just one more driver

      • 1 month ago
        Anonymous

        It probably took you 3x the wattage just to post that pic meanwhile I am enjoying my perfectly cool and quiet pc.

        • 1 month ago
          Anonymous

          >It probably took you 3x the wattage just to post that pic
          true my next pc is going to be intel + nvidia again

          • 1 month ago
            Anonymous

            >Intel
            kek

          • 1 month ago
            Anonymous

            That was my last pc, it still works fine maybe I could sell it to you as vintage electronics.

          • 1 month ago
            Anonymous

            >Intel
            kek

            That was my last pc, it still works fine maybe I could sell it to you as vintage electronics.

            also
            >PCIe v4.0 x16 (16.0 GT/s) @ x16 (8.0 GT/s)
            I have to run the first PCIe Gen 4 GPU at Gen 3 because the driver crashes otherwise. AMD is just unbeatable at this.

            >Intel
            kek

            my system uses more power than a 14900K + 4090 while idling thanks to AMD

          • 1 month ago
            Anonymous

            >my system uses more power than a 14900K + 4090 while idling thanks to AMD
            Funny, a 7850X3D takes less power than a 14900K (no-OC).

            >I have to run the first PCIe Gen 4 GPU at Gen 3 because the driver crashes otherwise. AMD is just unbeatable at this.
            That's what you get for buying a AMD GPU.

          • 1 month ago
            Anonymous

            >Funny, a 7850X3D takes less power than a 14900K (no-OC).
            not while Idling

          • 1 month ago
            Anonymous

            >not while Idling
            True true, the Ryzen chip mentioned idles around ~5W higher.
            But under load the Intel chip mentioned takes almost 150W more in the worst case for the same performance.

            So I'd rather have it idle 5W higher but take 50%-60% of the power of the Intel CPU the rest of the time. Even in silly scenarios like gaming, not even full blown multicore loads.
            When I don't use my PC it's sleeping anyways.

          • 1 month ago
            Anonymous

            bro, are you seriously defending intel-aviv and their decade-old architecture?
            if they wont move their ass, aymmd will never unfrick their shit [insert r7000 silicon delaminating here + the fact they run hot as all hell {did i mention a 10c increase in temps halves your component longevity? (as a rule of thumb)}]

          • 1 month ago
            Anonymous

            (cont.)
            or maybe thats the fricking plan
            turn the market into three segments:
            poorgays - aymd
            rich daddy kiddies - ngreedia
            pro users - both are equally good

  20. 1 month ago
    Anonymous

    But AMD has supported mesh shaders since the RX 6000 series, Alan Wake 2 uses them.

    • 1 month ago
      Anonymous

      >Alan Wake 2
      >OpenGL

      • 1 month ago
        Anonymous

        Oh, it's just the opengl implementation, not directx, my bad

  21. 1 month ago
    Anonymous

    >https://www.teamblind.com/browse/AMD-Layoffs-74083

    OH NO NO NO NO AMD BROS

    OUR DRIVERS ARE SHIT SO LETS LAYOFF OUR DRIVER TEAM

    • 1 month ago
      Anonymous

      THEY ARE THE PROBLEM
      Generally, the problems AMD has are the problems that fast-growing companies have, but they're catching up anyway

  22. 1 month ago
    Anonymous

    >OpenGL
    Didn't think ppl would play incoming in 2024

  23. 1 month ago
    Anonymous

    anyone with half a brain knows nvidia is better for production and gaming and that nvidia pioneers features while amd has to play catch up. amd users dont care, they willingly have a terrible experience because they wanted to save a few bucks. they ate up all the marketing from youtubers saying amd is totally comparable to nvidia and that fsr totally isnt complete dog water

  24. 1 month ago
    Anonymous

    Nvidium is crazy fast, it's ridiculous.

    • 1 month ago
      Anonymous

      Why are AMD users missing out on everything?

      First DLSS vs FSR (upscaling) where FSR didn't even give remotely similar results and still doesn't.
      All those DLSS mods for games that don't support it... the recent DLSS FG to FSR3 FG mod, an AMD tech but the mod only works on Nvidia GPUs... (Yes I know about the paid mod for AMD GPUs but it's broken, meanwhile the Nvidia mod works close to DLSS FG quality and performance).
      Now this... not to mention the drivers in general (on Windows)... *Sigh*.

      • 1 month ago
        Anonymous

        amd user here
        im not missing out on cheap shit
        if dlss is worth 1000 usd for you, go ahead
        but my sensibilities are slightly different so to speak

        • 1 month ago
          Anonymous

          >if dlss is worth 1000 usd for you
          even my whole system didn't cost that much, and I'm happy with access to DLSS

          • 1 month ago
            Anonymous

            A 4070 (non-Ti, non-Super) can be had for under 600€ new...
            Sure the equivalent AMD card (RX 7800 XT) costs 50€ less, but neither is 1000 USD or Euro. Not to mention lower end offerings for the casual gamer.

            Also 4070 Super costs 13% more on average and has on average 12% better performance than a RX 7800 XT.
            Meanwhile the next card up from AMD, the RX 7900 XT, costs over 100€ more than a 4070 Super and still beats the RX 7900 XT (no DLSS, no cheating).

            It's a no-brainer to see which GPU to pick.
            Information is from public benchmarks and current retail prices (actual).

            you do you
            my concern is computing power.
            thats expressed in raw "rasterization performance"
            i write my shit at a very low level. logical operations like 'and' 'or' 'not', you know the drill.
            i would have to engineer a solution to use matrix operator circuits that are inside the fancy "ai acceleration" to utilize them. if there only exists a framework to do so (and it should. its just that i would need to hack a fricking raytracer to do my bidding)

            this part concerns

            A 4070 (non-Ti, non-Super) can be had for under 600€ new...
            Sure the equivalent AMD card (RX 7800 XT) costs 50€ less, but neither is 1000 USD or Euro. Not to mention lower end offerings for the casual gamer.

            Also 4070 Super costs 13% more on average and has on average 12% better performance than a RX 7800 XT.
            Meanwhile the next card up from AMD, the RX 7900 XT, costs over 100€ more than a 4070 Super and still beats the RX 7900 XT (no DLSS, no cheating).

            It's a no-brainer to see which GPU to pick.
            Information is from public benchmarks and current retail prices (actual).

            specifically
            we have different use cases. basically compute shit is hacking gpus which only incidentally happen to be good for the job to do supercomputer computation but when youre a poorgay

        • 1 month ago
          Anonymous

          A 4070 (non-Ti, non-Super) can be had for under 600€ new...
          Sure the equivalent AMD card (RX 7800 XT) costs 50€ less, but neither is 1000 USD or Euro. Not to mention lower end offerings for the casual gamer.

          Also 4070 Super costs 13% more on average and has on average 12% better performance than a RX 7800 XT.
          Meanwhile the next card up from AMD, the RX 7900 XT, costs over 100€ more than a 4070 Super and still beats the RX 7900 XT (no DLSS, no cheating).

          It's a no-brainer to see which GPU to pick.
          Information is from public benchmarks and current retail prices (actual).

          • 1 month ago
            Anonymous

            >hacking gpus which only incidentally happen to be good for the job to do supercomputer computation but when youre a poorgay
            yeah, esl moment
            i mean that when youre a poorgay
            a gpu is a supercomputer
            but it wasnt deigned for that
            so theres always hacky shit involved
            and since amd offers the best compute
            for a poorgay scientist
            amd is the best bet
            esp. when youre mid in the subject (again, true GODS take whatever piece of equipment and just make it do their bidding.)

          • 1 month ago
            Anonymous

            (>true GODS: they will take a dozen fricking calculators and make doom:eternal run on it. i dont mean running gpus for number crunching qualifies)

          • 1 month ago
            Anonymous

            buy a 3060

          • 1 month ago
            Anonymous

            it offers no discernible advantage over an rx 590 compute-wise.
            im memory-bound, which means the only metric that matters to me is memory bandwidth, since its the limiting factor in my program.
            lets say you do a+b.
            you need to fetch the value of a and the value of b to do your calculation.
            and in my program, the fetching part is what takes the most time.
            think of it as physically moving ones and zeroes from one place to another (bc thats what you actually do; see the sketch at the end of this post)

            >a gpu is a supercomputer
            >but it wasnt designed for that
            Except when it literally was.

            yes, no?
            crunching number, yes, 10 years ago
            but today you got all the fancy physical circuits to facilitate computation related to graphics.
            truly compute solutions dont even have video output nowadays.
            so is it?
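
            to make the memory-bound point concrete, a rough sketch in plain C (host-side only; the ~256 GB/s figure is a ballpark assumption for an rx 590-class card, not a measurement):

            /* minimal sketch of a memory-bound kernel: c[i] = a[i] + b[i].
               per element you move 3 doubles (24 bytes) but do only 1 flop,
               so the memory system, not the ALUs, sets the pace. */
            #include <stddef.h>

            void vec_add(const double *a, const double *b, double *c, size_t n)
            {
                for (size_t i = 0; i < n; i++)
                    c[i] = a[i] + b[i];   /* 2 loads + 1 store per single add */
            }

            /* rough ceiling, assuming ~256 GB/s of usable bandwidth:
               256e9 / 24 = roughly 10.7e9 elements/s, i.e. ~10.7 GFLOP/s no matter
               how many ALUs are idling -- the fetches are the limit. */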

          • 1 month ago
            Anonymous

            >yes, no?
            >crunching number, yes, 10 years ago
            >but today you got all the fancy physical circuits to facilitate computation related to graphics.
            >truly compute solutions dont even have video output nowadays.
            >so is it?
            Is this bait?

          • 1 month ago
            Anonymous

            no
            i know full well theres me, the ONE (1) anon who does cuda
            and until proven otherwise
            anyone else who speaks about either knows it from second-hand experience, so to speak

          • 1 month ago
            Anonymous

            >a gpu is a supercomputer
            >but it wasnt designed for that
            Except when it literally was.

          • 1 month ago
            Anonymous

            it offers no discernible advantage over an rx 590 compute-wise.
            im memory-bound which means the only metric that matters to me is memory bandwith, since its the limiting factor in my program.
            lets say you do a+b.
            you need to fetch the value of a, and the value of b to do your calculation.
            and in my program, the fetching part is what takes the most time.
            think of it as if youre physically moving ones and zeroes from a place to another (bc thats what you actually do)

            [...]
            yes, no?
            crunching number, yes, 10 years ago
            but today you got all the fancy physical circuits to facilitate computation related to graphics.
            truly compute solutions dont even have video output nowadays.
            so is it?

            (cont.)
            we just happen to live in a moment when consumer equipment overlaps with the needs of compute

      • 1 month ago
        Anonymous

        AMD is competing against two dedicated, single-purpose hardware companies at the same time (intel gpus don't count) and beating one of them. If they beat both of them you would b***h about monopoly.

        • 1 month ago
          Anonymous

          AMD is already pulling anti-consumer tactics now that their CPUs are better than Intel's.

  25. 1 month ago
    Anonymous

    minetest is better anyway

    • 1 month ago
      Anonymous

      no..its not. I appreciate it being around though.

    • 1 month ago
      Anonymous

      I heard Minetest has no mobs. What mods should I install if I want to play Minetest in single player survival mode just like Minecraft?

      • 1 month ago
        Anonymous

        Minetest is not a game, it's a platform. If you want to play an almost-minecraft game you need to install mineclone 2 on it

  26. 1 month ago
    Anonymous

    >going with AMD OpenCL over CUDA
    Hahahahahaahah

  27. 1 month ago
    Anonymous

    Just use Zink then

    • 1 month ago
      Anonymous

      this

      Does Zink fix it?

      zink helps with AMD on Windows

    • 1 month ago
      Anonymous

      Does Zink fix it?

      You can try Sodium+Zink with AMD, depends on your configuration, it can help.
      Helps with Nvidia too without Nvidium, but Nvidium does a better job, having both Zink+Nvidium gives a worse time than just Nvidium.

      • 1 month ago
        Anonymous

        >Helps with Nvidia too without Nvidium, but Nvidium does a better job, having both Zink+Nvidium gives a worse time than just Nvidium.
        I get 380 FPS with Nvidium and 470 FPS with Nvidium+Zink tho.

  28. 1 month ago
    Anonymous

    Does Zink fix it?

    • 1 month ago
      Anonymous

      Zink doesn't implement GL_NV_mesh_shader. Refer to https://gitlab.freedesktop.org/mesa/mesa/-/issues/7192
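
      for reference, a renderer can probe for the extension at runtime and fall back when it's missing; a minimal sketch (assumes a GL 3.0+ context and that a loader like GLAD/GLEW already resolved glGetStringi):

      #include <string.h>
      #include <stdbool.h>
      #include <GL/gl.h>   /* glGetStringi / GL_NUM_EXTENSIONS come from your loader */

      static bool has_gl_extension(const char *name)
      {
          GLint n = 0;
          glGetIntegerv(GL_NUM_EXTENSIONS, &n);
          for (GLint i = 0; i < n; i++) {
              const char *ext = (const char *)glGetStringi(GL_EXTENSIONS, (GLuint)i);
              if (ext && strcmp(ext, name) == 0)
                  return true;   /* driver exposes the extension */
          }
          return false;          /* e.g. Zink or AMD's GL driver: take the regular path */
      }

      /* usage: bool mesh_ok = has_gl_extension("GL_NV_mesh_shader"); */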

  29. 1 month ago
    Anonymous

    Imagine trying to play a 3d retro-themed game about brown bricks made in 2011, but you can't anymore because your 4yo GPU doesn't support some obscure instruction that no one expected to become relevant.

  30. 1 month ago
    Anonymous

    Imagine your minecraft runs at 1200fps and not 3000fps because your gpu supports the feature in hardware but the manufacturer won't bolt a vendor-specific extension onto a deprecated graphics library just so minecraft morons don't have to use vulkan instead of pooGL

    • 1 month ago
      Anonymous

      it is THAT easy moron there's a handful of vulkan renderers for minecraft done in free time by some suicidal troons let alone a 4 trillion dollar company

      >mesh shaders in minecraft
      Gotta love modders https://www.youtube.com/watch?v=LX3uKHp1Y94

      Well, if microsoft wants to kill minecraft without new updates for an entire year, maybe they have resources to do it. I was from a time that Notch still owned Mojang, i've waited longer periods without update and i prefer it that way, everyone have time to get on the same page, etc, but the zoomers today are too addicted in dopamine for that, and there's too less of us that would endure such a wait.

      >Who the frick needs 96 chunks?
      No one, but to be fair it's crazy how much the render distance limits your vision in Minecraft, and you don't even realize it until you try one of those LoD mods.

      Minecraft with sodium and a couple of other mods already runs at 800+ FPS on a fricking RX 580, who gives a shit about getting another 200 FPS by using mesh shaders?

      Friendly reminder that when they talk about "minecraft" they refer to the shitty c copy previously known as bedrock edition before microshaft started calling it just minecraft

      • 1 month ago
        Anonymous

        >fail at C
        ugh, how you even do that?

      • 1 month ago
        Anonymous

        >Friendly reminder that when they talk about "minecraft" they refer to the shitty c copy previously known as bedrock edition before microshaft started calling it just minecraft
        moron
        Java version is still the dominate version on PC.

        • 1 month ago
          Anonymous

          If you meant "Dominating" then just barely, and it's decreasing fast.
          If you meant something along the lines of "Denominated" then no: since 1.12 they have been very aggressive in calling it java edition every time they talk about it. In any case bedrock edition is by far the most popular one, and the one that would get most of the attention if not for the shitshow that would ensue, given that a lot of mojang devs are just players (of the REAL version, not bedrock) that they hired because they did something cool in the game, plus the youtubers who only play java.
          also, just because almost all youtubers play it doesn't mean most players do.

          >fail at C
          ugh, how you even do that?

          They rushed into half-assedly copying the java version, not implementing the (de jure) bugs from java but implementing a shitton of other bugs that actually make the game unplayable, as opposed to the java "bugs" that allow for cool redstone. And because it's a completely alien ecosystem, the java mods designed to actually fix the wrongs in the original (which then subsequently end up being added into the game) simply cannot exist in bedrock

          Nvidium is a Java mod

          Yes, I actually didn't read the full image before I started ranting, but I already started and I'm not gonna let that stop me. I don't have any proof but I'm willing to bet it's not nearly as effective as it portrays itself as, it's like the early days of sodium when they talked about 10x the performance of optifine when in most cases it's 2x

          • 1 month ago
            Anonymous

            at a C level its all bits and bytes
            if they failed its because they didnt try hard enough
            especially C isnt an arcane ass language
            its used for onboarding for CS, exactly bc its so simple
            really
            the hard part in C is keeping track of things.
            its 100% surmountable with discipline

          • 1 month ago
            Anonymous

            >I'm willing to bet it's not nearly as effective as it portrays itself as, it's like the early days of sodium when they talked about 10x the performance of optifine when in most cases it's 2x
            Nvidium is crazy performant though. You have to try it to believe, I get it, I had my doubts too.

          • 1 month ago
            Anonymous

            The very concept of a logic contraption in bedrock edition is challenged by the fact that it uses RANDOMNESS to determine where a signal is going to go. In java edition you can tell with 90% precision (depending on the coordinates, so it can be 100% with good placing) all the events that will happen in a single tick, which lets you build incredibly fast redstone.
            Bedrock's redstone, on the other hand, is not only slower as a "feature", it also completely forbids any meaningful way of working in subtick time, which makes big complex contraptions such as computers completely useless. They do have some really useful features, I'm not going to deny that, but those are useful for exactly one type of redstone (slimestone) and every other type has to suffer for it.
            All because the idiots at microshaft did EXACTLY, WORD BY WORD, what cubehampster had predicted years before, and they even dared to call what he said a nonsensical slippery slope. So far everything he said (back in 1.10 iirc) has come true, and it will keep coming true until the ORIGINAL version of the game is completely castrated and sterilised to look more like bedrock, instead of them actually fixing the absolutely fricking insane pathetic piece of double stinking shit that is bedrock

            >I'm willing to bet it's not nearly as effective as it portrays itself as, it's like the early days of sodium when they talked about 10x the performance of optifine when in most cases it's 2x
            Nvidium is crazy performant though. You have to try it to believe, I get it, I had my doubts too.

            I don't have an nvidia card. I will try to at least investigate it once it gets out of beta tho; if it's really as good as it says I might even consider buying one

            at a C level its all bits and bytes
            if they failed its because they didnt try hard enough
            especially C isnt an arcane ass language
            its used for onboarding for CS, exactly bc its so simple
            really
            the hard part in C is keeping track of things.
            its 100% surmountable with discipline

            They failed because they tried to copy an already existing piece of software (and a very big one at that) several different times (pocket, win10, consoles (playstation/xbox and the others were all independent from each other, thats why the ps3 version was left behind)) and then merged them together into a big cancerous mess
            not only that, but they had to kill them all to get there. pocket edition used to be the GOAT, as good as the ORIGINAL minecraft

          • 1 month ago
            Anonymous

            >C and meincraft fail
            i stand corrected then.
            the portability of C is a myth;
            i didnt know consoles were involved
            win/loonix is manageable
            but when you go into consoles, thats different HW architectures, and then C's portability myth completely breaks down

          • 1 month ago
            Anonymous

            Nu-Minecraft is C++, not C.
            The portability of Java is a myth. Nu-Minecraft was successfully ported to multiple consoles, not to mention phones.
            It would have taken 100x more engineering effort to make the Java version run on iOS or Switch at >10fps than it took to rewrite the whole thing in C++.

          • 1 month ago
            Anonymous

            ok then, i guess
            i am not gonna be looking up the source code of minecraft so its up to you to duke it out
            (also: C/C++ really doesnt make a big difference in the context of consoles bc youre working with custom everything, so even if you code C++, youre still using primitive types like pointers and such)

          • 1 month ago
            Anonymous

            Why do you feel so confident in opining on these things if you're not a programmer?

          • 1 month ago
            Anonymous

            well, thats because ive been programming for 13 years now.
            so i guess im a programmer after all

          • 1 month ago
            Anonymous

            How can you write such stupid replies if you are a programmer? What have you been doing for the past 13 years?

          • 1 month ago
            Anonymous

            >stupid replies
            tell me, how? whats stupid about what i just wrote. lets hear it

          • 1 month ago
            Anonymous

            It's not my job to correct you, shitlord.
            That said, C is the most portable language you can write anything in, so I fail to understand how you could compose

            >C and meincraft fail
            i stand corrected then.
            the portability of C is a myth;
            i didnt know consoles were involved
            win/loonix is manageable
            but when you go into consoles, thats different HW architectures, and then C's portability myth completely breaks down

            if indeed you have been programming for 13 years.

          • 1 month ago
            Anonymous

            >im right
            >but i wont tell you why
            mkay i guess
            go be right somewhere else

            >That said, C is the most portable language you can write anything
            "hello world" doesnt count
            come back when you have dealt manually with endianness
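
            for anyone following along, "dealing with endianness manually" in C mostly means assembling values byte by byte instead of type-punning raw memory; a small generic sketch:

            #include <stdint.h>

            /* read/write a 32-bit value stored little-endian in a byte buffer;
               works the same on little- and big-endian hosts. */
            static uint32_t read_u32_le(const unsigned char *p)
            {
                return (uint32_t)p[0]
                     | ((uint32_t)p[1] << 8)
                     | ((uint32_t)p[2] << 16)
                     | ((uint32_t)p[3] << 24);
            }

            static void write_u32_le(unsigned char *p, uint32_t v)
            {
                p[0] = (unsigned char)(v);
                p[1] = (unsigned char)(v >> 8);
                p[2] = (unsigned char)(v >> 16);
                p[3] = (unsigned char)(v >> 24);
            }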

          • 1 month ago
            Anonymous

            Smooth move. Just pretend to be too dumb to argue with and you win by default.
            Riddle me this at least, what language would you write a cross-platform game in instead of C++?

          • 1 month ago
            Anonymous

            fuuuck
            i have no fricking idea
            C i guess
            and by cross platform we mean cross-architecture

            C bc i fricking hate C++ bc its moronic
            and because youre gonna code in C anyways
            but you need to literally reimplement the whole fricking engine each fricking time, for each hardware

            even if somehow you can use high level shit like opencl and such, your hw constraints are different. so the manner you access your memory will most likely be different.
            it might seem like not a big difference, but if theres any optimization in it, thats pretty much a full rewrite
            or you end up with a sorry mess
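
            toy illustration of the "how you access memory depends on the hardware" point, in plain C (no measured numbers, just the shape of the problem):

            #include <stddef.h>

            /* same reduction over an N x N matrix, two traversal orders.
               on a cache-based CPU the unit-stride walk is usually several times
               faster; on a GPU the preferred layout/ordering is different again
               (coalescing across threads), so a tuned kernel rarely ports 1:1. */
            double sum_row_major(const double *m, size_t n)   /* unit stride */
            {
                double s = 0.0;
                for (size_t i = 0; i < n; i++)
                    for (size_t j = 0; j < n; j++)
                        s += m[i * n + j];
                return s;
            }

            double sum_col_major(const double *m, size_t n)   /* stride of n */
            {
                double s = 0.0;
                for (size_t j = 0; j < n; j++)
                    for (size_t i = 0; i < n; i++)
                        s += m[i * n + j];
                return s;
            }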

          • 1 month ago
            Anonymous

            All relevant platforms for videogames are x64 or arm64 now. The last "weird" console was the 3DS, which died sometime in 2018. You don't really have to care about the CPU anymore.
            The big problem is GPUs. It's a real alphabet soup out there. Nu-Minecraft has D3D11, D3D12, Xbox-D3D12, OpenGLES, Metal, GNM and NVN renderers. More serious games do Vulkan and AGC as well.
            And that's just the APIs, you also need to optimise for all the GPUs your game is going to run on, which range from mostly-normal (GCN with weird buses) to bug-infested nightmare fuel (Adreno/Mali).
            The other big problem is that you can't do JIT on consoles and iOS, so if you want to write that renderer in anything fancier than C/C++ you're not gonna have a good time.
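
            fwiw the usual way engines cope with that alphabet soup is one thin renderer interface, one table of function pointers per backend, picked once at startup. a sketch of the pattern (not how Nu-Minecraft actually does it; every name below is made up):

            #include <stdint.h>
            #include <stdio.h>

            typedef struct {
                const char *name;
                void (*draw_chunk)(uint32_t mesh_id);
                void (*present)(void);
            } renderer_api;

            /* stub backends standing in for real gl.c / vk.c / d3d12.c / metal.c files */
            static void gl_draw(uint32_t id) { printf("GL: draw chunk %u\n", id); }
            static void gl_present(void)     { printf("GL: present\n"); }
            static const renderer_api gl_backend = { "OpenGL", gl_draw, gl_present };

            static void vk_draw(uint32_t id) { printf("VK: draw chunk %u\n", id); }
            static void vk_present(void)     { printf("VK: present\n"); }
            static const renderer_api vk_backend = { "Vulkan", vk_draw, vk_present };

            int main(void)
            {
                /* chosen once at startup; game code only ever talks to the table */
                const renderer_api *r = &vk_backend;
                printf("using %s backend\n", r->name);
                r->draw_chunk(42);
                r->present();
                return 0;
            }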

          • 1 month ago
            Anonymous

            after due consideration this really sounds like corporate pushing for either time limits or team rotation
            something is fricky in that story

            you can do a good job in C.
            you can do all jobs in C
            but it takes time and a good team
            corporate must have pushed for a lack of either or both

          • 1 month ago
            Anonymous

            >That said, C is the most portable language you can write anything in
            That's one of the biggest mistakes newbies can make. C is not portable in any way that you'd actually care about. Well-written C will *compile* with any decent compiler on any platform, but it isn't guaranteed to give you the same results. C was designed as a thin wrapper for the hardware. Whenever there was a question as to whether they should define the behaviour exactly or leave it up to the implementation, they left it up to the implementation. This was brilliant for performance, because you'd never have poorly-fitting CPUs burning extra cycles just to match some other CPU's method of bit shifting or overflow handling, but it's terrible for people who want to write once and then compile and run everywhere.
            Endianness, bit shifts, overflows, struct packing, alignment, hell, even sizeof(short/int/long) can and does vary from platform to platform, and all the spec has to say on the matter is "check with your compiler documentation."
            If you don't believe me, there are bugs in source ports of Quake on ARM CPUs because the code was designed around Intel CPUs and expects 80 bits of precision on floats and specific MUL/DIV handling that ARM CPUs do differently. This manifests as odd physics, getting stuck on slopes and such things.
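
            a tiny demo of the "compiles everywhere, answers differ" point; every line below is legal C, yet the output is allowed to change between platforms and compilers:

            #include <stdio.h>

            struct padded_maybe { char c; int i; };   /* padding/alignment is up to the ABI */

            int main(void)
            {
                printf("sizeof(short/int/long/long long) = %zu/%zu/%zu/%zu\n",
                       sizeof(short), sizeof(int), sizeof(long), sizeof(long long));
                printf("sizeof(struct {char; int;})      = %zu\n",
                       sizeof(struct padded_maybe));

                int x = -1;
                printf("-1 >> 1 = %d  (implementation-defined: arithmetic vs logical shift)\n",
                       x >> 1);

                unsigned u = 1;
                printf("host is %s-endian\n", *(unsigned char *)&u ? "little" : "big");
                return 0;
            }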

          • 1 month ago
            Anonymous

            You're overstating how many platform quirks there are today and how much effort they take to deal with, compared to how long it'll take you to get a blob of Java or whatever the frick running on a console at reasonable speed. C/C++ is still your best bet for portability, in the particular context we were discussing in this thread.

          • 1 month ago
            Anonymous

            [...]
            i meant up to you both to duke it out
            meinkraft: C or C++?

            Yes, sorry, it was cpp, but it doesn't change the fact that it's a blasphemous abomination that stems from trying to merge several different forks and rewrites, which are themselves ports of a java program, into the single amoebous mess that is bedrock edition
            also it was never "ported" to phones, thats the point of this

            [...]

            post: PE was just rushedly rewritten to be a MINECRAFT-LIKE GAME, *NOT* TO BE MINECRAFT. Pocket edition is where the original bedrock codebase was born, and then they had to mangle all the other versions (for example the legacy console codebase, which was older, had to be entirely dumped) and pocket edition itself to accommodate the unneeded merge between them

          • 1 month ago
            Anonymous

            now thats some solid lore.
            'sounds like corporate meddling. even tho, from a high level perspective, it makes sense to try to merge all various codebases into one unified branch
            'doesnt mean its feasible tho...
            in absolute terms it is, but it depends on the team youre working with
            its easy to forget the human factor in these considerations

          • 1 month ago
            Anonymous

            >sounds like corporate meddling.
            Brought to us by none other than M$
            And not only that, the team that did it was completely separate from the actual mojang java team, to the point that it was easier to reverse-engineer the game than to ask them. thats why the end result is so different and shitty

          • 1 month ago
            Anonymous

            Nu-Minecraft is C++, not C.
            The portability of Java is a myth. Nu-Minecraft was successfully ported to multiple consoles, not to mention phones.
            It would have taken 100x more engineering effort to make the Java version run on iOS or Switch at >10fps than it took to rewrite the whole thing in C++.

            i meant up to you both to duke it out
            meinkraft: C or C++?

          • 1 month ago
            Anonymous

            The reason deiter of bedrock was deprecated once you could run java mc in graalvm
            It doesn't need to exist

          • 1 month ago
            Anonymous

            >reason deiter

      • 1 month ago
        Anonymous

        Nvidium is a Java mod

      • 1 month ago
        Anonymous

        No? Bedrock doesn't have LoD mods

  31. 1 month ago
    Anonymous

    AMD gpus run much better on Linux anyway, so they cancel each other out.

  32. 1 month ago
    Anonymous

    Remember when the only sane choice for Linux was Nvidia

    • 1 month ago
      Anonymous

      still is

    • 1 month ago
      Anonymous

      Yeah and it sucked donkey dicks.
      fglrx sucked horse dick. Yeah, proprietary nvidia drivers sucked less, but amdgpu was such a game changer and I'm not going to pretend like leather jacket guy's shit is even in the same ballpark.

  33. 1 month ago
    Anonymous

    The worst part is that they didn't even have a reason to do so. They could at least have launched their """better""" together update as a completely independent piece of software without destroying their second best edition of the game, and they should have, because BEDROCK DOESN'T FILL THE SAME NICHE AS POCKET EDITION DID, namely something very light and performant (inb4 >>b-but bedrock is performant too. No, rendering only a third of the screen at 60 fps with 64 chunks and 1 whole tps is not performant, it's just shifting the load).
    Instead they filled it with bullshit like microtransactions (there were purchasable skins previously iirc, but you could have just downloaded them from the internet) and forcing you to connect to their shitty spyware microshaft servers every time you launch the game because of "piracy concerns" (which aren't even solved by it: if you connect with a pirated apk you can play just fine, so connecting to their servers is literally useless outside of being spyware). I have bought minecraft 7 times in total (3 ORIGINAL VERSION (one of them for someone else), 2 ps3, 1 bedrock (worst mistake of my life), and 1 pocket edition); minecraft is the ONLY software I ever paid for in my life. notch himself told someone to pirate the game to test it, and if there has ever been a game good enough to convince a diehard pirate to buy it, that is minecraft. So those so-called "piracy concerns" are not only not solved in the slightest, they actually ended up ruining the second best version of the game.
    That, and also the fact that they focused a lot of attention on it that could have gone to the ORIGINAL version instead, which ultimately ended in the catastrophic failure that was 1.17/1.18/(1.19/1.20)? (dunno, haven't played them much yet): instead of it being a single update, the idiots at microshaft decided to siphon all the money meant for the ORIGINAL version into making an even shittier iteration of bedrock edition

  34. 1 month ago
    Anonymous

    not only that, but if you go back to the early versions of Pocket Edition you would see IT'S NOT MEANT TO BE THE SAME GAME. it's almost like a demo++ with a unique twist, meant to be something similar to minecraft, not minecraft. the fact that microshaft confused the two has been the worst disaster to hit mankind in the last decade

    • 1 month ago
      Anonymous

      alright pocket baby go back to your little phone game and tik tok

      • 1 month ago
        Anonymous

        It was not a phone game, it was THE phone game. before it, the only things you could play on your phone were angry birds or some shitty unknown lowpoly 3d game; minecraft PE was the most successful 3d game for smartphones. not that I care tho, my Samsung Galaxy Y (2011) was (and still is, because it still works, and better than any phone today) NOT a gayming device, because smartphones shouldn't be used to play that kind of game in the first place. I only ever played it on my tablet
        Also don't misunderstand me, I never said Pocket edition was better than the ORIGINAL minecraft, but it was the best of the alternatives

  35. 1 month ago
    Anonymous

    Vega was a mistake.

    • 1 month ago
      Anonymous

      vega is fricking awesome
      consooming was a mistake
      when ayymd were dropping rx 590 2 years after releasing them you went like "ngh, poorgays"
      so shut the frick up now. i guess.

      • 1 month ago
        Anonymous

        overhyped and underdelivered, just like everything from amd

        • 1 month ago
          Anonymous

          'bought it for 170 bucks end of cycle.
          seethe.

  36. 1 month ago
    Anonymous

    Remember to set OpenGL to Prefer Performance in NVCP too. It will raise your 0.1% and 1% percentile lows by like 30%.

  37. 1 month ago
    Anonymous

    Draw distance alone doesn't mean shit. World of Warcraft has continued to increase its draw distance, but players and npcs only spawn in about 40 yards in front of you, so the world looks dead and empty.

    • 1 month ago
      Anonymous

      That's just a matter of CPU limitations
      The game already has a slider for simulation distance and asiaticbench scores are doubling every 6 years or so.
      Don't yell at me that it needs to scale with the square of the distance or whatever, i haven't done the math

    • 1 month ago
      Anonymous

      It's not like you'll be able to interact with anything further than 20 chunks away anyway.

  38. 1 month ago
    Anonymous

    What about the open-source driver on Linux

  39. 1 month ago
    Anonymous

    A quick net search told me AMD does support "mesh shitters" KYS homosexual

  40. 1 month ago
    Anonymous

    Typical EEE.

    • 1 month ago
      Anonymous

      more like skill issue

  41. 1 month ago
    Anonymous

    The latest tests use a 13900K, but they shouldn't update that relative performance table.

  42. 1 month ago
    Anonymous

    i didn't know this nvidium mod existed.

    with a 1060 6gb i usually get like 200-300fps on the latest versions of minecraft. i wonder how much nvidium would increase that.

    • 1 month ago
      Anonymous

      It's certainly useful when you add shaders

    • 1 month ago
      Anonymous

      I think you need a Turing or newer card for this mod; it won't work on a 1060.

      • 1 month ago
        Anonymous

        that's gay.

      • 1 month ago
        Anonymous

        >it won't work on a 1060.
        what about a 1070 then, hmm?

  43. 1 month ago
    Anonymous

    There's a guy doing voxel shaders replicating minecraft in vr chat in unity.
    Voxel rendering is way more optimizable than it used to be - apparently the game Teardown used something like this to vastly improve voxel rendering performance, and now it's common practice that's even spilling over to minecraft.

    • 1 month ago
      Anonymous

      Teardown raymarches through a per-object 3D texture in the fragment shader. These mesh shader renderers seem a lot more traditional, just very fast.
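
      roughly this idea, written CPU-side in C for illustration (the real thing runs per pixel in a fragment shader and samples a 3D texture; the grid here is just a stand-in):

      #include <math.h>
      #include <stdbool.h>

      #define N 64
      static unsigned char voxels[N][N][N];   /* 0 = empty, nonzero = solid; fill from your model */

      /* fixed-step march along o + t*d until a solid cell or the bounds are hit */
      bool raymarch(const float o[3], const float d[3], float max_t, float hit[3])
      {
          const float step = 0.5f;   /* in cell units; smaller = fewer missed thin walls */
          for (float t = 0.0f; t < max_t; t += step) {
              int x = (int)floorf(o[0] + d[0] * t);
              int y = (int)floorf(o[1] + d[1] * t);
              int z = (int)floorf(o[2] + d[2] * t);
              if (x < 0 || y < 0 || z < 0 || x >= N || y >= N || z >= N)
                  return false;                /* left the object's bounding box */
              if (voxels[x][y][z]) {
                  hit[0] = o[0] + d[0] * t;    /* report where we stopped */
                  hit[1] = o[1] + d[1] * t;
                  hit[2] = o[2] + d[2] * t;
                  return true;
              }
          }
          return false;
      }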

  44. 1 month ago
    Anonymous

    mesh shading can be emulated using compute shaders with minimal impact on performance. It's not as optimal as real mesh shaders, but it's good enough and it's the cross-platform solution. proper mesh shading support will be widespread in ~5 years, but it's not ready yet.
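
    host-side sketch of that fallback: a compute pass culls meshlets and writes one indirect draw command each, then a single multi-draw consumes them. the GL 4.3+ calls are standard, but the program/buffer names are made up for the sketch.

    #include <GL/gl.h>   /* 4.3+ entry points come from GLAD/GLEW or similar */

    void draw_chunks_compute_path(GLuint cull_program, GLuint draw_program,
                                  GLuint meshlet_buf, GLuint indirect_buf,
                                  GLuint vao, GLsizei num_meshlets)
    {
        /* pass 1: compute shader reads meshlet bounds, culls, and fills one
           DrawArraysIndirectCommand per meshlet (count = 0 when culled) */
        glUseProgram(cull_program);
        glBindBufferBase(GL_SHADER_STORAGE_BUFFER, 0, meshlet_buf);
        glBindBufferBase(GL_SHADER_STORAGE_BUFFER, 1, indirect_buf);
        glDispatchCompute((GLuint)(num_meshlets + 63) / 64, 1, 1);

        /* make the command-buffer writes visible to the indirect draw */
        glMemoryBarrier(GL_SHADER_STORAGE_BARRIER_BIT | GL_COMMAND_BARRIER_BIT);

        /* pass 2: one multi-draw pulls every surviving meshlet, no CPU round trip */
        glUseProgram(draw_program);
        glBindVertexArray(vao);
        glBindBuffer(GL_DRAW_INDIRECT_BUFFER, indirect_buf);
        glMultiDrawArraysIndirect(GL_TRIANGLES, 0, num_meshlets, 0);
    }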

    • 1 month ago
      Anonymous

      5 years is a lot of time for a videogame

  45. 1 month ago
    Anonymous

    why does minecraft need mesh shaders? It's 13 years old, it's fine to stop now.

    • 1 month ago
      Anonymous

      It's a good game

  46. 1 month ago
    Anonymous

    lol...proprietary buzzword "technology" from Company A does not exist on hardware from Company B. No shit?
