>E-cores
>the exact opposite of efficient
>weird uneven core/thread count
>3D cache
>89°C temp limit and its benefit diminishes with higher resolution gaming
>10nm Intel still outperforms 5nm TSMC in everything else
>Still no solution to automatically make use of multiple cores/threads
>Becky's M3 MacBook has similar computing power as top-of-the-line x86 desktop chips
>Both Intel and AMD huffing and puffing to keep up with ARM
>Meanwhile leather jacket man's 4090 is considered a national security threat and thus export restricted (despite having a gimped AD102 chip)
Where did it all go wrong?
ARM isn't a desktop architecture.
the Acorn Archimedes was made to be a cheap all-in-one workstation you stupid gay
Fine, where are the ARM and ARM2 desktop machines now
3D cache is actually useful but it needs to be made available for the whole CPU not just part of it like it currently is.
it's only useful for gaming and buying a 16 core for playing video games is stupid
It is if you buy a 7800x3D.
Wait for the 7950X3DXT
>Becky's M3 MacBook has similar computing power as top-of-the-line x86 desktop chips
>this is what iToddlers actually believe
the m3 max has an RTX 4090 equivalent GPU, what does the Intlel one have?
>the m3 max has an RTX 4090 equivalent GPU
delusional
4090 equivalent (or beats it potentially) in some workstation tasks only.
your dad beat you and your mom and that's why you grew up moronic
if your bullshit was true nobody would buy x86 and GPUs
yet nobody games on MacBook laptops. It has ZERO cooling; those benchmarks are cool, but for how long does it keep up that performance? No way it stays cool for more than a few minutes. If it did, x86 would be dead and I would be using a Mac. Yet nobody fricking wants it: none of my Steam games would run on it, my Valve Index won't run on it. As I said, ARM is for mobile phones and tablet-like laptops (MacBooks) where you don't need performance; x86 is for desktops.
MacBook Pros have fans, and the M3 Max is not just in laptops
still a shit cpu with performance from 10 years ago
proof?
real-time performance metrics put it around an 8th-gen i3. It's not very good, but because nobody on Mac really needs a faster CPU, nobody gives a frick.
synthetic benchmarks favor single core performance and nothing else; by that metric a 2-core i3 outperforms a 64-core AMD Threadripper
try harder like
who asked?
israelite
10 year old CPUs are fine for single threaded stuff. I'm writing this on a 10 year old Celeron with a YouTube video playing in the background. Most of the CPU intensive tasks people care about these days are highly parallel, provided they're not running brain dead software written by Indians.
x86 wouldnt be dead either way because it is the total opposite of what Apple ARM does.
It's modular and open. Not exactly Apple's motivation to compete in an open space.
But you cant deny the performance, which for the great vast majority of people is more than enough.
AMD64 is great wish it actually had competition.
Maybe the utopian Risc-V will save us.
street shit harder
Now compile chromium on your toy CPU
As usual jeets have no argument
iOS and macos have the smallest market share in India, jeetbro.
>Becky's M3 MacBook has similar computing power as top-of-the-line x86 desktop chips
you read too much nonsense
at best it's up there with a contemporary i5 or r5
has as much credibility as a shit smear on toilet paper
>bar on SEO website bigger therefore gooder
iToddlers are something else
I dont know what those dudes are smoking, but the M3 Max was already beaten in absolute performance, performance/watt, and performance/price when it came out
no solution to automatically make use of multiple cores/threads
APO
I can see a few issues with the P/E-core approach.
1. A desktop is not a phone. A phone needs to operate all day even in very low power conditions. A desktop and a laptop simply hibernate when not used.
2. I am sure Android is optimized for this approach. Not Windows, not Linux, and especially not all the Windows and Linux apps.
3. At the end of the day, if you want more battery life on a laptop, just put in a bigger battery. The limit for airplanes is 100 watt-hours.
The e-core shit is so they can push the core count higher on marketing.
Desktops still have assloads of background processes, why waste high performance cores when you can offload them to E-Cores
Mark my words amd will be doing the big-little design in the next generation.
Good. I don't want discord/edge/random services taking up big cores.
Zen 4c cores already exist
You can already buy ryzen 8000 chips with big little
>its benefit diminishes with higher resolution gaming
Because you start waiting on the GPU.
>Becky's M3 MacBook has similar computing power as top-of-the-line x86 desktop chips from 12 years ago
ftfy
>ARM
enjoy your locked bootloader, homosexual
enjoy your 2 hours of battery life
battery?
>enjoy your locked bootloader
Apple engineers went out of their way to make the bootloader unlocked
These images are clearly reversed. CPUs try every way they can to gain performance without raising prices, while Nvidia just doubles the cores for triple the price and lets the latest GPU meme cover for the lack of progress.
>Becky's M3 MacBook has similar computing power as top-of-the-line x86 desktop chips
dont think so buddy. That thing has NO cooling, and ARM processors are for devices like mobile phones, not laptops. Show me what top-tier games it can run at a level similar to my R9 7950X + Radeon 7900 XTX. YEAH THATS WHAT I THOUGHT PUSSY!!
ARM LAPTOPS ARE THE SAME AS MOBILE PHONES = SAME PERFORMANCE
M3 Single Thread performance is worse than i5-2500k LMAO
https://xmrig.com/benchmark?cpu=Apple+M3
https://xmrig.com/benchmark?cpu=Intel%28R%29+Core%28TM%29+i5-2500K+CPU+%40+3.30GHz
Imagine being worse than a 10 year old CPU
an i3 9100 has better single core performance than a threadripper
>weird uneven core/thread count
It's worse than that: on some mobile phones the E-cores don't even support the same SIMD extensions as the normal cores.
Just take a minute to appreciate the clusterfrick of having to check supported CPU SIMD extensions at runtime instead of compile time.
Pure insanity.
>the exact opposite of efficient
this is a windows issue, not a CPU issue - windows doesn't effectively allocate the cores
>weird uneven core/thread count
this is an autism thing, it's just a number grow up
>3D cache
has legitimate uses and in many workloads vastly outperforms traditional caching methods
>89°C temp limit
how is this an issue? you don't need to touch your CPU
>10nm Intel still outperforms 5nm TSMC
intel doesn't measure in the same way as everyone else; these are actually very similarly sized transistors. This is also a point in favour of the CPUs you were just shitting on?
>Still no solution to automatically make use of multiple cores/threads
again, windows issue
>Becky's M3 MacBook has similar computing power as top-of-the-line x86 desktop chips
becky's macbook is heavily optimised for very specific, typically video render, workloads and is shit on in nearly every other scenario by nearly every other competing product
also this is mostly due to the GPU being on the same die and sharing the same memory; in high end servers there are x86 chips that do this and achieve similar levels of performance gains relative to their position - ergo this isn't a CPU development issue, it's a "do you want it modular or do you want it fast" issue
>Both Intel and AMD huffing and puffing to keep up with ARM
ARM is still outperformed by top of the line x86 CPUs, it's only relative to power consumption that ARM holds the top spots - not really AMD or intel's fault as they can't just jump ship for ARM especially with such little software support
>Meanwhile leather jacket man's 4090 is considered a national security threat and thus export restricted
that's the US government being moronic, the 4090 isn't so stupidly powerful it dwarfs CPUs or anything, it's just the US not wanting China to have any way of gaining ground in the AI arms race - also, GPUs are a very, very different kind of processor so this isn't really relevant in a CPU discussion
>ergo this isn't a CPU development issue it's a "do you want it modular or do you want it fast" issue
to clarify, you can either have
>one big chip with GPU, CPU and memory on it
>it's really expensive
>replace the entire thing to upgrade any part of it
>it's really fricking fast
or
>a separate module each for CPU, GPU and RAM
>easy upgrades
>low price
>it's much slower than bleeding edge
you cannot have both
bleeding edge enterprise servers and becky's m3 macbook use the former and simply replace the entire computer when it's time to upgrade, so if you don't wanna do that you've gotta compromise
>10nm Intel still outperforms 5nm TSMC in everything else
The nanometers haven't corresponded with reality since about 2008.
it has always been about feature size
It was about the length of the gate. Also, there are no 3nm features either.
you should read up on why the M1 is good https://archive.is/c8r7Z the tldr is that Apple controls both OS and hardware and makes extensive use of accelerators, which combined with that OS control leads to better performance for consumer computing. It will not beat the traditional stack in general computing, but it comes very close, especially when looking at power efficiency.
I wonder, if Intel had their own OS, could they achieve the same thing? Maybe, but they will never do it since they cannot sell CPUs that lose a good chunk of performance on Windows.
That's not the Intel way; if they made an OS it would just perform like crap on non-GenuineIntel CPUs, like they did with their compiler years ago.
>I wonder if Intel had their own OS
What is clearlinux
Intel didn't create Linux.
well I guess I worded it wrong. What I meant was: what if Intel had an OS+hardware platform similar to Apple's? Could they achieve the same results on x86? I assume so. It's not going to happen either way. I did expect MS to do it, but they already get money shoved up their ass so why bother?
No because AMD created long mode.
actual schizo shit
AMD destroys apple's meme cpus in efficiency and intel in performance
>E-cores
>the exact opposite of efficient
>weird uneven core/thread count
They are a good concept for increasing multicore performance in the same die space (4 E-cores take the same amount of space as 1 P-core), but you need 1. a very good scheduler and 2. E-cores that support the same instructions as the P-cores
>3D cache
>89°C temp limit and its benefit diminishes with higher resolution gaming
It's still good even at higher resolutions in physics/AI heavy autism games and stuttery, unoptimized Unity slop
>10nm Intel still outperforms 5nm TSMC in everything else
Temperature and power draw (higher is better)
>Still no solution to automatically make use of multiple cores/threads
Schedulers have existed for a long time
>Becky's M3 MacBook has similar computing power as top-of-the-line x86 desktop chips
Not really
>Both Intel and AMD huffing and puffing to keep up with ARM
They could make ARM CPUs any time they want, anyone can purchase an ARM license, but they would be useless as Qualcomm still has the exclusive deal for ARM Windows. But I bet they at least have some prototypes in the lab.
I saw anons a few days ago discussing how the CPU is better than the GPU nowadays for every task, when the future really is CUDA being god tier and it's only a matter of monotonous code work on each of your threads
Not too long ago rich troony also was telling me how the newest apple CPU will be the best xyz when they had no idea I did CS
I still have an i5 3570K overclocked a whole-ish GHz; everything still runs on 4 threads for the most part, so I can just write everything in CUDA for my used 1070 Ti and not give a shit
moronic pic, the current CPU market is far better than the shitheap that is the current GPU market.
yeah, the CPUs are actually seeing innovation with E-cores and V-Cache
meanwhile GPUs are just getting bigger with beefier heatsinks
most of these are false
Completely different products designed through different architectures for different results. CPUs are doing just fine.
>Becky's M3 MacBook has similar computing power as top-of-the-line x86 desktop chips
get a toilet curryBlack person