Imagine still using DDR4.

  1. 2 years ago
    Anonymous

    What's the point of FPS that exceeds your monitor's refresh rate?

    • 2 years ago
      Anonymous

      Who the frick uses a monitor that's at least 120Hz these days?

      • 2 years ago
        Anonymous

        Most people.

    • 2 years ago
      Anonymous

      I forget the details, but with G-Sync you won't get tearing, and if the game runs above your refresh rate you get lower input latency. Sorry, vague memory.

    • 2 years ago
      Anonymous

      Lower latency. With tearing you can receive new information on at least part of the screen much sooner than your monitor's refresh rate would allow with vsync.
      See: CS:GO running at many multiples of most monitors' refresh rates. Go ahead and try moving your mouse around with a 200fps vs 400fps cap set in the console.
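
      Rough sketch of that math (the model and the numbers below are just illustrative, ignoring driver and display lag):

      # Crude model: how "old" the newest visible information can be, in ms.
      def worst_case_info_age_ms(fps, refresh_hz, vsync):
          frame_ms = 1000.0 / fps
          refresh_ms = 1000.0 / refresh_hz
          # With vsync a finished frame still waits for the next refresh;
          # with tearing a fresh slice can hit the panel mid-scanout.
          return frame_ms + refresh_ms if vsync else frame_ms

      print(worst_case_info_age_ms(60, 60, vsync=True))    # ~33.3 ms
      print(worst_case_info_age_ms(400, 60, vsync=False))  # ~2.5 ms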

    • 2 years ago
      Anonymous

      What's the point in more than 16 colors?

    • 2 years ago
      Anonymous

      Latency.

    • 2 years ago
      Anonymous

      I know a dude playing CS:GO at 600 fps on a 120Hz monitor.

    • 2 years ago
      Anonymous

      Every frame, the game loop checks the user input, so the higher the FPS, the sooner your inputs are accounted for.
      At 60 FPS it takes ~16.6 ms for your input to take effect, but at 200 FPS it would only take 5 ms.
      Other than in competitive games, it's a waste of processing power to render all those frames you can't see.
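
      The arithmetic as a quick sketch (it only models the once-per-frame input poll, not render or display latency):

      # Frame time and average input-poll wait at a few frame rates.
      def frame_time_ms(fps):
          return 1000.0 / fps

      for fps in (60, 144, 200, 400):
          # An input waits on average about half a frame before it is sampled,
          # a full frame in the worst case.
          print(f"{fps} fps: frame {frame_time_ms(fps):.1f} ms, "
                f"avg wait {frame_time_ms(fps) / 2:.1f} ms")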

      • 2 years ago
        Anonymous

        GPU usage also factors into input lag. Usually around 300-400 fps is the sweet spot for those games, unless it can run at 800 fps or higher, stable.

    • 2 years ago
      Anonymous

      https://blurbusters.com/faq/benefits-of-frame-rate-above-refresh-rate/

    • 2 years ago
      Anonymous

      Placebo effect

    • 2 years ago
      Anonymous

      Headroom in case of frame drops. But if anything in OP's graphic is higher than your refresh rate, you're doing shit wrong.

  2. 2 years ago
    Anonymous

    Imagine wasting your money on sticks of memory right before a societal collapse just for a 2% increase in speed.

    • 2 years ago
      Anonymous

      I welcome the collapse. Then I'll just go and take those things.

    • 2 years ago
      Anonymous

      I live on a self sufficient LDS compound that I built with my 7 brothers, our 16 wives, and our 47 children. We are going to be OK. good luck Gentile!

    • 2 years ago
      Anonymous

      >2%
      Failed maths?

      • 2 years ago
        Anonymous

        +20% which isn’t enough you fricking clown

    • 2 years ago
      Anonymous

      As long as there has been society, people have been predicting its imminent collapse. I'm not too worried.

    • 2 years ago
      Anonymous

      Even during the world wars, neither European nor East Asian society collapsed. The storm may be coming, but it will not be as great as you believe it to be.

  3. 2 years ago
    Anonymous

    I'm on DDR3 and am fine

    • 2 years ago
      Anonymous

      same here.
      4gb ddr3, ati radeon hd5850, i5 2.6, 1tb hdd

      • 2 years ago
        Anonymous

        >same here.
        >4gb ddr3, ati radeon hd5850, i5 2.6, 1tb hdd
        No ssd?

      • 2 years ago
        Anonymous

        you really should get an SSD

  4. 2 years ago
    Anonymous

    >no sub-timings or even frequency shown
    It's still going to be a while before DDR5 is able to outperform high end DDR4 kits. In the end this dick measuring contest only applies to people using high end graphics cards that consume as much power as a microwave oven.

    • 2 years ago
      Anonymous

      Those are all high-end kits.

      >I forget the details, but with G-Sync you won't get tearing, and if the game runs above your refresh rate you get lower input latency.

      FreeSync/G-Sync and VRR in general. Plus that anon totally ignored the 1% lows, which matter even if you only have a 60 or 75Hz monitor from 12 years ago.

      • 2 years ago
        Anonymous

        They're not the 5,000MHz+ kits that have been coming out recently, are they?

        https://www.pcgamesn.com/fastest-ddr4-ram

        • 2 years ago
          Anonymous

          >Intel only sticks
          Why does this even matter?

          • 2 years ago
            Anonymous

            ...there are no amd ddr5 platforms yet.

          • 2 years ago
            Anonymous

            Anon, we're talking about DDR4.

          • 2 years ago
            Anonymous

            Well some are still stuck on ddr3 (cuz we stupid asf)

  5. 2 years ago
    Anonymous

    Wait, so DDR5 pushes frame rates at 4K independent of the GPU? I thought the graphics card was the only bottleneck at 4K.

    • 2 years ago
      Anonymous

      Same GPU; the CPU and memory can still be a bottleneck even when they aren't fully utilized. There's more to load than just hitting 100% usage, and the same applies to GPUs.
      With faster memory everything just runs more efficiently, so you get to push your GPU even harder.

      • 2 years ago
        Anonymous

        This is an 8GB test, though. DDR5's higher speed could alleviate the low RAM amount by cycling data in and out faster.

        • 2 years ago
          Anonymous

          I think anon's point is that dual-channel 2x8GB DDR4 gets beaten by dual-channel 2x8GB DDR5 even at 4K.

  6. 2 years ago
    Anonymous

    >look, we're 20% better at a function of memory which is never the bottleneck, consume more overpriced shit
    Wrong board

    • 2 years ago
      Anonymous

      >Suck my ass, shill.
      >you can barely discern the differences after 80fps so lol, enjoy being a homosexual
      >COOMSOOM

      You people are so fricking annoying. What's the point of posting on a board about technology when you only b***h and complain about new things like an old man yelling at a cloud?

      • 2 years ago
        Anonymous

        >no do not think
        >just consoom
        >praise consoomers!

        • 2 years ago
          Anonymous

          You’re annoying as frick dude and your newbie is showing. The consoom meme is about people who replace their personality with things they buy. Like marvel fans that fill their houses with funko pops. Instead of having a hobby, consoooomers watch and buy things. This thread is about new technology. On the board for technology. Technology people use for work, for creative projects, and for gaming. I’m assuming you’re a poor teenage newbie NEET who is new here so you just throw out words to try and fit in, but that’s not what the meme means. Try to keep up and maybe you’ll look like less of a fricking fool.

          • 2 years ago
            Anonymous

            take your meds

        • 2 years ago
          Anonymous

          Do you expect the world to just stay on DDR3 forever? Do you expect technology to just stagnate? You’re a fricking idiot, have a nice day and get off this board
          >NOOO YOU CANT JUST IMPROVE THINGS AND MAKE NEW INNOVATIONS IN TECHNOLOGY!!!
          >IT REMINDS ME THAT IM JUST A SAD POORgay WHO CANT AFFORD IT ANYWAY!!!
          >STOP MAKING ME FEEL BAD ABOUT MY PHENOM 2 AND GTX 760 STOP IT STOP IT!!!!!!!!!

          • 2 years ago
            Anonymous

            ywnbaw

          • 2 years ago
            Anonymous

            >take your meds

            If you're this annoying on an anonymous imageboard I can't imagine how much of an unlikable gay you are in real life

      • 2 years ago
        Anonymous

        >only b***h and complain about new things

        DDR1
        DDR2
        DDR3
        DDR4
        DDR5 <----- This is new?

        His point is that tech isn't improving, it's just being revised and tweaked. Going from DDR4 to DDR5 will not be noticeable for anyone. If you need benchmarks to reveal a difference that you cannot otherwise perceive, the difference is insignificant.

        For example, even the densest folks will notice the difference going from an HDD to an SSD for their programs drive. Sub-5-second boots, most programs starting in under 3 seconds.

        Going from a SATA SSD to NVMe... not so much.

        • 2 years ago
          Anonymous

          Yes, it's called incremental improvement over time. DDR5 doesn't exist for DDR4 customers, it exists for DDR3 customers who are finally upgrading. Just like how new phones only have minor improvements each year. You're not expected to upgrade every generation, because the improvements add up over time.

        • 2 years ago
          Anonymous

          >+20fps average gain
          >insignificant

          • 2 years ago
            Anonymous

            It's faster in some, slower in others, and costs twice as much. That's worse than insignificant, that's terrible.

        • 2 years ago
          Anonymous

          Well said anon, I blame the linus youtube crowd myself for the sorry state the pc scene is in now.

        • 2 years ago
          Anonymous

          https://i.imgur.com/syJckA8.png

          Imagine still using DDR4.

          For some reason I was looking at old AnandTech articles from when DDR memory first became available, and the exact same buyer advice applies today as it did then. The first batch will be low speed, low density, and overpriced, and no platform will be seriously bottlenecked by the cheaper alternative for at least 1.5 years. We saw this with DDR3-800 and DDR4-2133.

          Back then there was also a serious lack of motherboards with DDR support, and it wasn't until both Intel and AMD had widely available DDR boards that prices became reasonable.

  7. 2 years ago
    Anonymous

    Suck my ass, shill.

  8. 2 years ago
    Anonymous

    >no speed
    >no timings
    >not a single dual-rank kit
    Imagine being this much of a moronic street shitter.

  9. 2 years ago
    Anonymous

    Are these with an integrated GPU?

    • 2 years ago
      Anonymous

      >any iGPU running a recent game at 4k Ultra at 80 FPS
      if that was true, nobody would give a shit about dGPUs

      • 2 years ago
        Anonymous

        I didn't really look at the actual numbers (I also have never heard of Wonderlands before), I just figured that RAM speed would have the greatest impact on iGPUs. I am surprised to see such a big difference on a dGPU. I wonder if there is a similar difference across other applications/games or if this one just heavily favors ram speed.

  10. 2 years ago
    Anonymous

    you can barely discern the differences after 80fps so lol, enjoy being a homosexual

    • 2 years ago
      Anonymous

      Pretty sad to think that. You really never experience 144Hz or more? Dang.
      I thought even poorgays these days run at least 144Hz with the FPS to match.

  11. 2 years ago
    Anonymous

    I'm still using DDR3 just fine anon.

    • 2 years ago
      Anonymous

      >X5690
      God I hope you're not actually using that for anything serious.
      I had a dual-X5650 PowerEdge and my 1700X would run circles around it while consuming significantly less power, and that was already many years ago now.

      • 2 years ago
        Anonymous

        I use it for all my Windows needs. It does just fine for everything I've needed of it.

        • 2 years ago
          Anonymous

          Just wastes a lot of power for no reason but okay.

          • 2 years ago
            Anonymous

            The whole machine with three displays uses about 500W on average usage. Down to 300W when idle and around 600W with both the CPU and GPU loaded up. I'm not concerned with it since the machine still works fine. Maybe in a couple years once chip costs come down I'll build a new box but there is no need right now.

            [...]
            That and it makes the 1080Ti mostly useless, talk about mismatching

            The PCIe 2.0 link is more of a detriment than the X5690 is. Both CUDA tasks and gaymes do just fine on it. I don't play anything new enough to need more.

          • 2 years ago
            Anonymous

            I'm also on DDR3. Also, power optimization on a PC is moronic for 99% of users when you haven't reduced electricity bills from HVAC or water heating first, or when you're a NEET who doesn't even pay for the electric bill.

          • 2 years ago
            Anonymous

            The power a PC uses compared to a fridge, freezer, water heater, or HVAC as you mentioned is negligible. I'm not worried about it being inefficient at this point. The hardware has lasted a decade without issues and still does what I want. Maybe in a couple years I'll build a new machine, but at this point I don't need it.
            The Z400 stays off most of the day and is only on for a couple hours a day after work. On weekends it's on for most of the day, but that's only two days a week.
            My M900 Tiny shitpost box is on 24/7; it uses so little power, especially at idle, that it's cold to the touch.

          • 2 years ago
            Anonymous

            >The whole machine with three displays uses about 500W on average usage
            That's half my AC usage wtf.
            Are you a NEET who doesn't have to care about the bills?

          • 2 years ago
            Anonymous

            Power is pretty cheap here. That figure is based off the UPS readout, which has more than just that PC plugged in.
            This is 500W at 110V, to be clear.
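
            Back-of-the-envelope on that figure (the hours per day and price per kWh here are made-up assumptions, not anon's numbers):

            # Rough current draw and running cost for a 500W load on 110V.
            watts = 500
            volts = 110
            hours_per_day = 3        # assumed: "a couple hours after work"
            price_per_kwh = 0.15     # assumed electricity rate, USD

            amps = watts / volts                          # ~4.5 A
            kwh_per_day = watts * hours_per_day / 1000    # 1.5 kWh/day
            monthly_cost = kwh_per_day * 30 * price_per_kwh
            print(f"{amps:.1f} A, {kwh_per_day:.1f} kWh/day, ~${monthly_cost:.2f}/month")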

        • 2 years ago
          Anonymous

          >Just wastes a lot of power for no reason but okay.

          That and it makes the 1080Ti mostly useless, talk about mismatching

  12. 2 years ago
    Anonymous

    >people are still choosing dual channel

    • 2 years ago
      Anonymous

      >windows 10 1607
      why?

    • 2 years ago
      Anonymous

      DDR5 is quad channel with just 2 sticks :^) (each DIMM is split into two independent 32-bit subchannels)

  13. 2 years ago
    Anonymous

    Won't matter once stacking on more cache, like 3D V-Cache, catches on.

  14. 2 years ago
    Anonymous

    waow!! gotta consoom!!

  15. 2 years ago
    Anonymous

    I’m still using ddr3

  16. 2 years ago
    Anonymous

    In most games from that video it was actually showing worse performance with DDR5 lmao

  17. 2 years ago
    Anonymous

    i only play aoe II and dota 2 anyway

  18. 2 years ago
    Anonymous

    COOMSOOM

  19. 2 years ago
    Anonymous

    DDR4-2133
    vs
    DDR5-4800
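
    For reference, peak theoretical bandwidth per 64-bit DIMM is just the transfer rate times 8 bytes; a quick sketch:

    # Peak theoretical bandwidth per 64-bit DIMM: MT/s x 8 bytes per transfer.
    def peak_bandwidth_gbs(mt_per_s):
        return mt_per_s * 8 / 1000  # GB/s

    print(peak_bandwidth_gbs(2133))  # ~17.1 GB/s (DDR4-2133)
    print(peak_bandwidth_gbs(4800))  # ~38.4 GB/s (DDR5-4800)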

  20. 2 years ago
    Anonymous

    I don't understand what the difference is.

  21. 2 years ago
    Anonymous

    let me guess - this is the one game where DDR5 actually outperforms DDR4, and the price is almost double

  22. 2 years ago
    Anonymous

    Yes. Gonna keep using it for the next several years in a mini PC I just ordered. So it won't even be full-size desktop DDR4. It'll be laptop DDR4-3200. Lol

  23. 2 years ago
    Anonymous

    yeah, and have 30ms+ end-to-end latency, no thanks, you can eat dick yourself

  24. 2 years ago
    Anonymous

    On a processor that supports DDR5, sure.
    You're handicapping your CPU at that rate.

  25. 2 years ago
    Anonymous

    8GB DDR5 sticks are trash too. Proper 16GB sticks would be better.

  26. 2 years ago
    Anonymous

    >$70 for a low-tier 8GB stick
    no thanks, I'll just stick with AM4, upgrade the CPU in a year, and then wait until DDR5 gets cheap

  27. 2 years ago
    Anonymous

    >no speeds or latency
    Pointless

    • 2 years ago
      Anonymous

      You realize that's shown at the beginning of the video?

      • 2 years ago
        Anonymous

        Post them then. Making people sift through cancer is poor form.

      • 2 years ago
        Anonymous

        What video? You posted a picture with no source.

  28. 2 years ago
    Anonymous

    show 16GB DDR4 Black person, I know that's a "how does 8GB do in 2022" Linus video.

  29. 2 years ago
    Anonymous

    3600-16-16-16 here.
    I have no idea what you Black folk are talking about, who cares about bandwidth when your memory latency is shit?
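
    For the latency side, the usual back-of-the-envelope is CAS cycles divided by the memory clock (half the transfer rate); rough sketch, with a typical early DDR5-4800 CL40 kit assumed for comparison:

    # First-word (CAS) latency in ns; memory clock in MHz is half the MT/s figure.
    def cas_latency_ns(mt_per_s, cl):
        return cl / (mt_per_s / 2) * 1000

    print(cas_latency_ns(3600, 16))  # ~8.9 ns  (DDR4-3600 CL16)
    print(cas_latency_ns(4800, 40))  # ~16.7 ns (assumed early DDR5-4800 CL40)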

  30. 2 years ago
    Anonymous

    just consoom so you can play new Marvel games 2% faster
    you're a transphobe if you don't

  31. 2 years ago
    Anonymous

    8gb ddr2 reporting in

  32. 2 years ago
    Anonymous

    I think 6600-7000 MT/s speeds are fine for DDR5. You won't need 8000 MT/s, because the 3D V-Cache CPUs have shown us that they don't care much about RAM speed. One less reason to be a waitcuck.

  33. 2 years ago
    Anonymous

    >20% more frames for 100% more money

    • 2 years ago
      Anonymous

      It appears to give more frames at 4K, which is worth something.

  34. 2 years ago
    Anonymous

    Imagine getting just a handful of frames more on fricking meme 4k bullshit that no sane person uses and thinking your pointless RAM upgrade is great because of it.

  35. 2 years ago
    Anonymous

    >DDR5 makes for more faster vidya gaems!*
    >(*in some games, particularly modern ones)
    Besides, upgrading from 4 to 5 for those sick FPS is only a concern for people who already use a top-of-the-line, expensive frickoff big GPU.

  36. 2 years ago
    Anonymous

    I don't know what game that is but it's probably shit

  37. 2 years ago
    Anonymous

    All of that sounds like trash RAM,
    especially 8GB DDR5 sticks; those things are all fricking shite, actually worse than any decent DDR4.

    • 2 years ago
      Anonymous

      Early DDR kits are always pipecleaners that fabs use to recoup costs as they start binning better chips. I don't know why this is such a point of contention every single time a new DDR iteration releases. It's like you shit-for-brained subhuman apes never learn a single thing in your lives.
      >hurrr DDR3 1333 is shit!
      >my stupid high binned leet OC DDR2 is better!
      >hurr DDR4 2400 is shit!
      >my stupid high binned leet OC DDR3 is better!
      >hurr DDR5 5000 is shit!
      >my stupid high binned leet OC DDR4 is better!

      • 2 years ago
        Anonymous

        You're not wrong; even current high-tier DDR5 kits are shit in many cases compared to even a half-assed B-die tune. What I mean is that 8GB DDR5 is extra fricking shit even compared to that, and it is absolutely not worth spending money on in any capacity.

      • 2 years ago
        Anonymous

        Alder Lake is holding back DDR5 with its dual memory controllers. The tech is going to get a lot better after Zen 4.

        • 2 years ago
          Anonymous

          That's not how any of this works, brainlet. The IMC in Alder Lake has literally nothing to do with how DDR5 is binned.

          • 2 years ago
            Anonymous

            Hynix M-die DDR5 has unrealized headroom, dum-dum.
            They're topping out at 6800-7200 now because ADL's IMC isn't super great and Z690 board traces are atrocious.

  38. 2 years ago
    Anonymous

    imagine wasting your time playing vidya and not ricing window managers.

  39. 2 years ago
    Anonymous

    Anything above 30FPS hurts my eyes.

  40. 2 years ago
    Anonymous

    >no speeds or latency listed
    >single channel same as dual channel

    yeah, great graph!

  41. 2 years ago
    Anonymous

    i already had 16gb of ddr4 and recently doubled that. my choice was:
    >discard my existing ram, buy 32 gb of ddr5
    or
    >buy 16gb of cheap ddr4
    it was an easy decision

  42. 2 years ago
    Anonymous

    How much did Corsair pay you?

  43. 2 years ago
    Anonymous

    Why would I? The life expectancy of my AM4 board in regards to the CPUs I can put in is pretty fantastic, and since it only supports DDR4 I'm sticking with it. Currently very comfy with 32GB, and this PC has never seen a single 3D graphics application, so I'm golden.

  44. 2 years ago
    Anonymous

    First batch of DDR5 is dogshit.
    DDR4-4400 CL18 is going for cheap these days. You can even overclock (or underclock) it a little and it's faster than DDR5 for gaymes in 98% of titles.
    The overclock being CL15 at 4133MHz, 1T.
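
    Same first-word latency math applied to those two configurations (rough, sub-timings ignored):

    # First-word latency for the stock kit vs the CL15 "underclock".
    def cas_latency_ns(mt_per_s, cl):
        return cl / (mt_per_s / 2) * 1000

    print(cas_latency_ns(4400, 18))  # ~8.2 ns (DDR4-4400 CL18 stock)
    print(cas_latency_ns(4133, 15))  # ~7.3 ns (CL15 at 4133MHz)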

    • 2 years ago
      Anonymous

      This. When DDR4 released, you were moronic to try and adopt it in the first year or two of public availability. It's the same now.

      • 2 years ago
        Anonymous

        Stop coping. Too many people in this thread are butthurt DDR4 owners playing down DDR5. Fact is the 3D cache CPUs don't care much about RAM speed. This means you can start buying DDR5 as soon as new CPU models come out. Waitcucking has never been smart, it's how poorgays try to save a nickel and a dime.

        • 2 years ago
          Anonymous

          >as soon as new CPU models come out
          I'm going to buy DDR5 when there is a need for a new PC. And I can not see that happening in this decade.

          • 2 years ago
            Anonymous

            A lot of people come to these threads to justify their recent purchases. Just accept that there is always better stuff right around the corner.

          • 2 years ago
            Anonymous

            It's not even recent. My current PC is 6 years old. The days when you couldn't use a decade old PC at all are over. I don't give one shit for better stuff. I run everything I own into the ground and fix it with duct tape twice before buying anything new (although even then I'll most likely buy used).

        • 2 years ago
          Anonymous

          >This means you can start buying DDR5 as soon as new CPU models come out.
          The new CPU models with 3D cache won't be out until 2023.

        • 2 years ago
          Anonymous

          No, DDR5 in its current state is actually slower in many cases due to having such high latency. The modules also run very hot.

  45. 2 years ago
    Anonymous

    >$400 for 32GB
    dios mio

  46. 2 years ago
    Anonymous

    I'm fine with DDR2, thank you very much.

    • 2 years ago
      Anonymous

      Good, I'm also using DDR2 and DDR3

      • 2 years ago
        Anonymous

        similar here. I'm currently on 2nd gen Core i on all of my machines. There is nothing I'm aware of that would warrant paying money for something new for as long as these machines aren't outright broken.

  47. 2 years ago
    Anonymous

    I'm still GPU bottlenecked even with a 5900X, X570, and 32GB of dual-rank 3200MHz CL16, driving a 3440x1440 144Hz monitor, 120Hz 3K VR, and a 4K 60Hz TV with a 6900 XT. Faster RAM won't help shit unless I go to 240Hz, which is pointless until VR, RT, and TVs run at that.

    • 2 years ago
      Anonymous

      Well, if you go 4K exclusively you should have gone with a 3090/3090 Ti.

      • 2 years ago
        Anonymous

        >Well, if you go 4K exclusively you should have gone with a 3090/3090 Ti.
        3K is all that ALVR and other streaming tech on the Quest 2 and others can handle before it lags. Maybe the Pimax R12K, Valve Deckard (Index 2), or Quest 3 will run 4K and 240Hz UW natively, and then I can justify getting an 8900 XT in the next year or two.

      • 2 years ago
        Anonymous

        4K ultra always needs the top card. Maybe RTX 5080 changes this, but if you look at OP's game it dips below 60 on ultra.

    • 2 years ago
      Anonymous

      I ain't parsing all those numbers lol

  48. 2 years ago
    Anonymous

    Not posting the source should default to a month long ban.

  49. 2 years ago
    Anonymous

    I have 4x32GB DDR4. Why would I upgrade?
    There aren't any 64GB DDR5 DIMMs yet.

  50. 2 years ago
    Anonymous

    How does this affect me rendering stuff in Eevee?

  51. 2 years ago
    Anonymous

    >Guys, look! We improved DDR4 speeds by 20%!
    >By doubling the fricking latency
    Bravo.

  52. 2 years ago
    Anonymous

    8GB of DDR3 has worked fine for me for the last 10 years.

  53. 2 years ago
    Anonymous

    Imagine buying Intel CPUs.
    For the extra cost of DDR5, the newest Intel CPU, and a new mainboard, I could just get a way better GPU.

  54. 2 years ago
    Anonymous

    5800x3d is faster

  55. 2 years ago
    Anonymous

    >2x8GB DDR5 is slower than 1x8GB DDR5
    wtf

  56. 2 years ago
    Anonymous

    Somebody explain this to me because I'm kind of a brainlet when it comes to some of the technical details on RAM and memory.

    Obviously, DDR5 will take a while to mature and reach the full potential that'll surpass DDR4, just like what happened with every RAM cycle. But, theoretically, if I bought a DDR5 motherboard and Alder lake CPU now, could I easily upgrade to that proper, mature DDR5 RAM when it's gotten good in a year or so, or are there hardware level differences in the motherboards/CPU which would kneecap it and I should just wait for the next gen CPUs?

    • 2 years ago
      Anonymous

      >or are there hardware level differences in the motherboards/CPU which would kneecap it and I should just wait for the next gen CPUs?

      Yes. It's possible that some mobos (especially low-end ones) might not support all RAM stick capacities or combinations. It is possible that those problems get ironed out with BIOS updates, but that depends on the manufacturer.

      If you want more info, google Ryzen 1st-gen memory problems. It didn't support all speeds and was very picky with timings.

    • 2 years ago
      Anonymous

      >are there hardware level differences in the motherboards/CPU which would kneecap it and I should just wait for the next gen CPUs?
      The memory controllers are also going to get better, yes. I don't think anyone ever made a fuss about Intel's memory controller when DDR4 first came out, but first-gen Ryzen has a pretty bad memory controller (and of course, at the same time, the Zen design benefits disproportionately from fast RAM), and it can only reliably hit around 3000MHz.

  57. 2 years ago
    Anonymous

    Post the rest where there's no difference :^)

  58. 2 years ago
    Jesus

    Imagine playing videogames past the age of 12...

    • 2 years ago
      Anonymous

      have a nice day

      • 2 years ago
        Jesus

        Imagine being so childish as to care what other people do in their free time

        KeKw

    • 2 years ago
      Anonymous

      Imagine being so childish as to care what other people do in their free time

  59. 2 years ago
    sentientCode

    This is a strange place
