what's the point of FPS that exceeds your monitor's refresh rate?
Who the frick uses a monitor that's at least 120Hz these days?
Most people.
i forget the details, but with gsync you won't get tearing, and if the game runs above the refresh rate you still get lower input latency. sorry, vague memory
Lower latency: with tearing you can receive information on at least part of the screen much more quickly than your monitor's refresh rate would allow with vsync.
See: CSGO running at many multiples higher FPS than most monitors. Go ahead and try moving your mouse around with a 200 fps vs 400 fps cap in console.
What's the point in more than 16 colors?
Latency.
I know a dude playing csgo at 600 fps on a 120Hz monitor.
every frame, the game loop checks the user input, so the higher the FPS, the sooner your inputs are accounted for.
At 60 FPS it takes up to ~16.6 ms for your input to take effect, but at 200 FPS it would only take 5 ms
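A back-of-the-envelope sketch of that arithmetic, assuming input is sampled once at the start of each frame (the function name here is made up for illustration):

```python
def frame_latency_ms(fps: float) -> float:
    """Worst-case delay (ms) between an input event and the next frame
    that can reflect it, if input is sampled once per frame."""
    return 1000.0 / fps

for fps in (60, 144, 200, 400):
    print(f"{fps:>4} FPS -> up to {frame_latency_ms(fps):.1f} ms of input delay")
```

This gives ~16.7 ms at 60 FPS and 5 ms at 200 FPS, matching the numbers above; real end-to-end latency also includes the render queue, display scanout, and peripheral polling.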
other than in competitive games it's a waste of processing power to render out all those frames you can't see
GPU usage also factors into input lag. Usually around 300-400 fps is the sweet spot for those games, unless the game can run at 800 fps or higher, stable.
https://blurbusters.com/faq/benefits-of-frame-rate-above-refresh-rate/
Placebo effect
headroom in case of frame drops. but if anything in OP's graphic is higher than your refresh rate, you're doing shit wrong
Imagine wasting your money on sticks of memory right before a societal collapse just for a 2% increase in speed.
I welcome the collapse. Then I'll just go and take those things.
I live on a self sufficient LDS compound that I built with my 7 brothers, our 16 wives, and our 47 children. We are going to be OK. good luck Gentile!
>2%
Failed maths?
+20% which isn’t enough you fricking clown
As long as there has been society, people have been predicting its imminent collapse. I'm not too worried.
even during the world wars, neither European nor East Asian society ever collapsed. The storm may be coming, but it will not be as great as you believe it to be.
I'm on DDR3 and am fine
same here.
4gb ddr3, ati radeon hd5850, i5 2.6, 1tb hdd
>same here.
>4gb ddr3, ati radeon hd5850, i5 2.6, 1tb hdd
No ssd?
you really should get a ssd
>no sub-timings or even frequency shown
It's still going to be a while before DDR5 is able to outperform high end DDR4 kits. In the end this dick measuring contest only applies to people using high end graphics cards that consume as much power as a microwave oven.
Those are all high end kits.
FreeSync/G-Sync and VRR in general. Plus anon totally ignored the 1% low numbers, even if they only have a 60 or 75Hz monitor from 12 years ago.
They're not the 5,000MHz+ kits that have been coming out recently, are they?
https://www.pcgamesn.com/fastest-ddr4-ram
>Intel only sticks
Why does this even matter?
...there are no amd ddr5 platforms yet.
Anon, we're talking about DDR4.
Well some are still stuck on ddr3 (cuz we stupid asf)
Wait, so DDR5 pushes frame rates at 4K independent of the GPU? I thought only graphics card was the bottleneck.
Same GPU; the CPU+memory can still be a bottleneck even when they aren't fully utilized. There's more to load than just showing 100% usage. The same applies to GPUs too.
Everything is just running more efficiently, so you get to push your GPU even more.
This is a 8GB test though. DDR5 speeds could alleviate the low RAM by being able to cycle data faster.
I think anon's point is that dual-channel 2x8GB DDR4 gets beaten by dual-channel 2x8GB DDR5 even at 4K
>look, we're 20% better at a function of memory which is never the bottleneck, consume more overpriced shit
Wrong board
You people are so fricking annoying. What’s the point of posting on a board about technology when you only b***h and complain about new things like an old man yelling at a cloud?
>no do not think
>just consoom
>praise consoomers!
You’re annoying as frick dude and your newbie is showing. The consoom meme is about people who replace their personality with things they buy. Like marvel fans that fill their houses with funko pops. Instead of having a hobby, consoooomers watch and buy things. This thread is about new technology. On the board for technology. Technology people use for work, for creative projects, and for gaming. I’m assuming you’re a poor teenage newbie NEET who is new here so you just throw out words to try and fit in, but that’s not what the meme means. Try to keep up and maybe you’ll look like less of a fricking fool.
take your meds
Do you expect the world to just stay on DDR3 forever? Do you expect technology to just stagnate? You’re a fricking idiot, have a nice day and get off this board
>NOOO YOU CANT JUST IMPROVE THINGS AND MAKE NEW INNOVATIONS IN TECHNOLOGY!!!
>IT REMINDS ME THAT IM JUST A SAD POORgay WHO CANT AFFORD IT ANYWAY!!!
>STOP MAKING ME FEEL BAD ABOUT MY PHENOM 2 AND GTX 760 STOP IT STOP IT!!!!!!!!!
ywnbaw
If you’re this annoying on an anonymous imageboard I can’t imagine how much of an unlikable gay you are in real life
>only b***h and complain about new things
DDR1
DDR2
DDR3
DDR4
DDR5 <----- This is new?
His point is that tech isn't improving, it's just being revised and tweaked. Going from DDR4 to DDR5 will not be noticeable for anyone. If you need benchmarks to reveal a difference that you cannot otherwise perceive, the difference is insignificant.
For example, even the densest folks will notice the difference going from an HDD to an SSD for your programs drive. Sub-5-second boots, most programs starting in under 3 seconds.
Going from SSD to NVME...not so much.
Yes, it's called incremental improvement over time. DDR5 doesn't exist for DDR4 customers; it exists for DDR3 customers who are finally upgrading. Just like how new phones only have minor improvements each year. You're not expected to upgrade every generation, because the improvements add up over time
>+20fps average gain
>insignificant
It's faster in some, slower in others and costs twice as much. That's less than insignificant, that's terrible.
Well said anon, I blame the linus youtube crowd myself for the sorry state the pc scene is in now.
For some reason I was looking at old AnandTech articles from when DDR memory first became available, and the exact same buyer advice applies today as it did then. The first batch will be low speed, low density, and overpriced, and no platform will be seriously bottlenecked by the cheaper alternative for at least 1.5 years. We saw this with DDR3-800 and DDR4-2133.
Back then there was also a serious shortage of motherboards with DDR support, and it wasn't until both Intel and AMD had widely available DDR boards that prices became reasonable.
Suck my ass, shill.
>no speed
>no timings
>not a single dual-rank kit
Imagine being this much of a moronic street shitter.
Are these with an integrated GPU?
>any iGPU running a recent game at 4k Ultra at 80 FPS
if that was true, nobody would give a shit about dGPUs
I didn't really look at the actual numbers (I've also never heard of Wonderlands before); I just figured RAM speed would have the greatest impact on iGPUs. I'm surprised to see such a big difference on a dGPU. I wonder if there's a similar difference across other applications/games or if this one just heavily favors RAM speed.
you can barely discern the differences after 80fps so lol, enjoy being a homosexual
Pretty sad to think that. You really never experience 144Hz or more? Dang.
I thought even poorgays these days do at least 144Hz + FPS.
I'm still using DDR3 just fine anon.
>X5690
God I hope you're not actually using that for anything serious
I had a dual X5650 poweredge and my 1700X would run circles around it while consuming significantly less power, and that was already many years ago now
I use it for all my Windows needs. It does just fine for everything I've needed of it.
Just wastes a lot of power for no reason but okay.
The whole machine with three displays uses about 500W on average usage. Down to 300W when idle and around 600W with both the CPU and GPU loaded up. I'm not concerned with it since the machine still works fine. Maybe in a couple years once chip costs come down I'll build a new box but there is no need right now.
The PCIe 2 is more of a detriment than the X5690 is. Both cuda tasks and gaymes do just fine on it. I don't play anything new enough to need more.
I'm also on DDR3. Also, power optimization on PC is moronic for 99% of users when you don't reduce electricity bills from HVAC or water heating first, or you're a NEET who doesn't even pay for the electric bill.
The power a PC uses compared to a fridge, freezer, water heater, or HVAC, as you mentioned, is negligible. I'm not worried about it being inefficient at this point. The hardware has lasted a decade without issues and still does what I want. Maybe in a couple years I'll build a new machine, but at this point I don't need it.
The Z400 stays off most of the day and is only on for a couple hours a day after work. Weekends it's on for most of the day but that's only two days a week.
My M900 tiny shitpost box is on 24/7 since it uses so little power especially idle it's cold to the touch.
>The whole machine with three displays uses about 500W on average usage
That's half my AC usage wtf.
Are you a neet that doesn't have to care about the bills?
Power is pretty cheap here. That figure is based off the UPS, which has more than just that PC plugged into it.
This is 500W at 110V, to be clear.
That and it makes the 1080Ti mostly useless, talk about mismatching
>people are still choosing dual channel
>windows 10 1607
why?
ddr5 is quad channel with just 2 sticks :^)
Won't matter once adding more cache like 3d v-cache catches on.
waow!! gotta consoom!!
I’m still using ddr3
In most games from that video it was actually showing worse performance with DDR5 lmao
i only play aoe II and dota 2 anyway
COOMSOOM
DDR4-2133
vs
DDR5-4800
I don't understand what the difference is.
let me guess - this is the one game where DDR5 actually outperforms DDR4, and the price is almost double
Yes. Gonna keep using it for the next several years in a mini PC I just ordered. So it won't even be full-size desktop DDR4. It'll be laptop 3200 DDR4. Lol
yeah and have 30ms+ e2e latency, no thanks you can eat dick yourself
On a processor that supports DDR5, sure.
You're handicapping your CPU at that rate.
8G DDR5 is trash too. Proper 16G sticks would be better.
>$70 for a low-tier 8GB stick
no thanks ill just stick with AM4 and upgrade the cpu in a year and then wait until ddr5 gets cheap
>no speeds or latency
Pointless
You realize that's in the video beginning?
Post them then. Making people sift through cancer is poor form.
What video? You posted a picture with no source.
show 16GB DDR4 Black person, I know that's a "how does 8GB do in 2022" Linus video.
3600-16-16-16 here.
I have no idea what you Black folk are talking about, who cares about bandwidth when your memory latency is shit?
just consoom so you can play new marvel games at 2% faster
you're a transphobe if you don't
8gb ddr2 reporting in
I think 6600-7000mt/s speeds are fine for DDR5. You won't need 8000mt/s because 3D cache CPUs have shown us that they don't care much about RAM speed. One less reason to be a waitcuck.
>20% more frames for 100% more money
It appears to give more frames at 4K, which is worth something.
Imagine getting just a handful of frames more on fricking meme 4k bullshit that no sane person uses and thinking your pointless RAM upgrade is great because of it.
>DDR5 makes for more faster vidya gaems!*
>(*in some games, particularly modern ones)
Besides, upgrading from 4 to 5 for those sick FPS is only a concern for people who already use a top-of-the-line, expensive frickoff big GPU.
I don't know what game that is but it's probably shit
All of that sounds like trash RAM,
especially 8GB DDR5 sticks; those things are all fricking shite, actually worse than any decent DDR4.
Early DDR kits are always pipecleaners that fabs use to recoup costs as they start binning better chips. I don't know why this is such a point of contention every single time a new DDR iteration releases. It's like you shit-for-brained subhuman apes never learn a single thing in your lives.
>hurrr DDR3 1333 is shit!
>my stupid high binned leet OC DDR2 is better!
>hurr DDR4 2400 is shit!
>my stupid high binned leet OC DDR3 is better!
>hurr DDR5 5000 is shit!
>my stupid high binned leet OC DDR4 is better!
You're not wrong; even current high-tier DDR5 kits are shit in many cases compared to even a half-assed B-die tune. What I mean is that 8GB DDR5 is extra fricking shit even compared to that, and it is absolutely not worth spending money on in any capacity.
Alder Lake is holding back DDR5 with its dual memory controllers. The tech is going to get a lot better after Zen 4.
That's not how any of this works, brainlet. The IMC in Alder Lake has literally nothing to do with how DDR5 is binned.
Hynix M-die ddr5 has unrealized headroom dum-dum.
They're topping out at 6800-7200 now because ADL's IMC isn't super great, and Z690 board traces are atrocious.
imagine wasting your time playing vidya and not ricing window managers.
Anything above 30FPS hurts my eyes.
>no speeds or latency listed
>single channel samea s dual channel
yeah great graph!
i already had 16gb of ddr4 and recently doubled that. my choice was:
>discard my existing ram, buy 32 gb of ddr5
or
>buy 16gb of cheap ddr4
it was an easy decision
How much did Corsair pay you?
Why would I? The life expectancy of my AM4 board in regards to the CPUs I can put in is pretty fantastic, and since it only supports DDR4 I'm sticking with it. Currently very comfy with 32GB, and this PC has never seen a single 3D graphics application, so I'm golden.
First batch of DDR5 is dogshit.
DDR4 CL18 4400MHz is going for cheap these days. You can even overclock (or underclock) it a little and it's faster than DDR5 for gaymes in 98% of titles.
Overclock being CL15 4133MHz, 1T
this. when DDR4 released you were moronic to try and adopt it in the first year or two of public availability. it's the same now
Stop coping. Too many people in this thread are butthurt DDR4 owners playing down DDR5. Fact is the 3D cache CPUs don't care much about RAM speed. This means you can start buying DDR5 as soon as new CPU models come out. Waitcucking has never been smart, it's how poorgays try to save a nickel and a dime.
>as soon as new CPU models come out
I'm going to buy DDR5 when there is a need for a new PC. And I cannot see that happening this decade.
A lot of people come to these threads to justify their recent purchases. Just accept that there is always better stuff right around the corner.
It's not even recent. My current PC is 6 years old. The days when you couldn't use a decade old PC at all are over. I don't give one shit for better stuff. I run everything I own into the ground and fix it with duct tape twice before buying anything new (although even then I'll most likely buy used).
>This means you can start buying DDR5 as soon as new CPU models come out.
The new CPU models with 3D cache won't be out until 2023.
No, DDR5 in its current state is actually slower in many cases due to having such high latency. The modules also run very hot.
>$400 for 32GB
dios mio
I'm fine with DDR2, thank you very much.
Good, I'm also using DDR2 and DDR3
similar here. I'm currently on 2nd gen Core i on all of my machines. There is nothing I'm aware of that would warrant paying money for something new for as long as these machines aren't outright broken.
I'm still GPU-bottlenecked even with a 5900X, X570, and 32GB of CL16 dual-rank 3200MHz RAM, driving 3440x1440 at 144Hz, 120Hz VR at 3K, and a 4K 60Hz TV with a 6900XT. Faster RAM won't help shit unless I go to 240Hz, which is pointless until VR, RT, and TVs run at that.
well If you go 4k exclusively you should have gone with a 3090/3090 Ti
>well If you go 4k exclusively you should have gone with a 3090/3090 Ti
3K is all ALVR and other streaming tech on the Quest 2 and others can handle before it lags. Maybe the Pimax 12K, Valve Deckard (Index 2), or Quest 3 will run 4K 240Hz UW native, and then I can justify getting an 8900XT in the next year or two.
4K ultra always needs the top card. Maybe RTX 5080 changes this, but if you look at OP's game it dips below 60 on ultra.
I ain't parsing all those numbers lol
Not posting the source should default to a month long ban.
I have 4x32GB DDR4. Why would I upgrade?
There aren't any 64GB DDR5 DIMMs yet.
How does this affect me rendering stuff in Eevee?
>Guys, look! We improved DDR4 speeds by 20%!
>By doubling the fricking latency
Bravo.
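The "doubled latency" jab can be sanity-checked with the usual first-word latency formula: CAS cycles divided by the actual clock rate, which is half the MT/s figure since DDR transfers twice per clock. The kit numbers below are just illustrative examples, not the ones from the benchmark:

```python
def cas_latency_ns(transfer_rate_mts: int, cl: int) -> float:
    """Approximate first-word CAS latency in nanoseconds.
    DDR transfers twice per clock, so the clock is MT/s / 2."""
    clock_mhz = transfer_rate_mts / 2      # e.g. DDR4-3200 -> 1600 MHz clock
    return cl * 1000.0 / clock_mhz         # CAS cycles * ns per cycle

print(cas_latency_ns(3200, 16))  # DDR4-3200 CL16 -> 10.0 ns
print(cas_latency_ns(4800, 40))  # DDR5-4800 CL40 -> ~16.7 ns
```

So in this example early DDR5 pays roughly 65% more absolute CAS latency than a good DDR4 kit, not double, while delivering much higher bandwidth; prefetching and cache hide some of the difference in practice.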
8gb ddr3 works fine for me for the last 10 years.
Imagine buying Intel CPUs.
For the extra cost of DDR5, the newest Intel CPU, and a new mainboard, I could just get a way better GPU.
5800x3d is faster
>2x8GB DDR5 is slower than 1x8GB DDR5
wtf
Somebody explain this to me because I'm kind of a brainlet when it comes to some of the technical details on RAM and memory.
Obviously, DDR5 will take a while to mature and reach the full potential that'll surpass DDR4, just like what happened with every RAM cycle. But, theoretically, if I bought a DDR5 motherboard and Alder Lake CPU now, could I easily upgrade to that proper, mature DDR5 RAM when it's gotten good in a year or so, or are there hardware level differences in the motherboards/CPU which would kneecap it and I should just wait for the next gen CPUs?
>or are there hardware level differences in the motherboards/CPU which would kneecap it and I should just wait for the next gen CPUs?
Yes. It's possible that some motherboards (especially low-end ones) might not support all RAM stick capacities or combinations. Those problems may be ironed out with BIOS updates, but that depends on the manufacturer.
If you want more info, google "Ryzen 1st gen memory problems". It didn't support all speeds and was very picky with timings.
>are there hardware level differences in the motherboards/CPU which would kneecap it and I should just wait for the next gen CPUs?
The memory controllers are also going to get better, yes. I don't think anyone ever made a fuss about Intel's memory controller when DDR4 first came out, but first-gen Ryzen had a pretty bad memory controller (and of course, at the same time, the Zen design benefits disproportionately from fast RAM) and could only reliably hit around 3000MHz.
Post the rest where there's no difference :^)
Imagine playing videogames past the age of 12...
have a nice day
KeKw
Imagine being so childish as to care what other people do in their free time
This is a strange place