>no AV1 support
>no hardware ray tracing
>in 2022
Good luck, Apple.
qualcomm doesn't support that israelitegle shit codec either
VVC is coming, AV1=DOA
>VVC is coming
>proprietary, royalty-encumbered, locked down codec
Nah, I can't do it.
>proprietary
the reference encoder and decoder are open source
VVC is worse than AV1, but okay.
>No AV1 support
>No VVC support
>No hardware ray tracing
>In 2022
nah
kek
lmao
wake me up when it's compatible with x86
Wake up, Rosetta 2 exists.
Isn't this the kind of thing that ultimately holds back technology? If every new standard has to support the old stuff too, how will things ever advance beyond the previous generation
It is needed.
"congratulations, you have to rebuy every piece of software you ever had and only some exist" is a death sentence to any new technogy.
just don't be poor
i prefer the M3
Based IQfy-tist
It's useless anyway; only gaymers would maybe care, and gaymes on Mac are still shit.
3D artists use RTX to speed up rendering.
https://code.blender.org/2019/07/accelerating-cycles-using-nvidia-rtx/
Are you a 3D artist? no, do you even have any artsy skill? no
ok
Why does he have to be a 3D artist to be able to critique Apple not having RT cores? It's a genuine miss especially with Apple doing AR/VR and investing into Blender. Without any sort of RT, Apple will keep losing to Nvidia Laptop RTX 3050s in rendering speed even with Metal and the M1 Ultra.
RTX is a godsend for me
sauce on image pls
looks more like a satansend to me
touch grass
touch grass
RTX hardware is basically an "is this line intersecting those triangles" accelerator.
There's a FRICKTON of science that can be done with that.
But scientists only use PCs anyway, so it's not of concern.
ray-tri is only one piece of the puzzle. There's a lot more including ray-box, BVH traversal, bucket sort RT, motion blur acceleration (kinda a stupid feature but it exists on RTX 3000), and that's just what we know about and exists right now
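For anyone curious what those RT units actually compute, the two core tests are small enough to sketch in plain Python. This is a toy sketch, not what the hardware literally runs: real RT cores do these tests in fixed-function silicon for millions of rays while walking a BVH, not one primitive at a time.

```python
def ray_triangle(orig, d, v0, v1, v2, eps=1e-9):
    """Moller-Trumbore ray/triangle intersection: returns hit distance t, or None on miss."""
    sub = lambda a, b: [a[i] - b[i] for i in range(3)]
    dot = lambda a, b: sum(a[i] * b[i] for i in range(3))
    cross = lambda a, b: [a[1]*b[2] - a[2]*b[1],
                          a[2]*b[0] - a[0]*b[2],
                          a[0]*b[1] - a[1]*b[0]]
    e1, e2 = sub(v1, v0), sub(v2, v0)
    p = cross(d, e2)
    det = dot(e1, p)
    if abs(det) < eps:                  # ray parallel to the triangle's plane
        return None
    inv = 1.0 / det
    s = sub(orig, v0)
    u = dot(s, p) * inv                 # first barycentric coordinate
    if u < 0.0 or u > 1.0:
        return None
    q = cross(s, e1)
    v = dot(d, q) * inv                 # second barycentric coordinate
    if v < 0.0 or u + v > 1.0:
        return None
    t = dot(e2, q) * inv                # distance along the ray
    return t if t > eps else None

def ray_aabb(orig, d, lo, hi):
    """Slab test for ray vs. axis-aligned box -- the 'ray-box' half of BVH traversal."""
    t0, t1 = 0.0, float("inf")
    for i in range(3):
        if d[i] == 0.0:                 # ray parallel to this slab's axis
            if not (lo[i] <= orig[i] <= hi[i]):
                return False
            continue
        a = (lo[i] - orig[i]) / d[i]
        b = (hi[i] - orig[i]) / d[i]
        t0, t1 = max(t0, min(a, b)), min(t1, max(a, b))
    return t0 <= t1
```

BVH traversal is essentially ray-box tests to skip empty space, then ray-triangle tests at the leaves; doing both in dedicated hardware instead of shader code is where the RTX rendering speedup comes from.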
>releasing two new laptops in 2022 that can only have 1 external display.
>pretending laptops are relevant when we have desktops and tablets
laptop is a poorgay compromise
>IQfy supporting tablets
Haven’t been here in years and holy shit you homosexuals went full moron
Most likely immigrants from other boards. Give it a week; happens every Apple event.
>Haven’t been here in years
And it shows
I think I’m better for it if being here means you turn into an apple dick sucking tablet user.
nothing wrong with sucking dicks
Agreed but sucking corporate dick is bad.
Sucking the dick of people that don't care about you is bad
>no AV-1 decoding
>but 8K x264, x265 and ProRes encode and decode
How can apple say with a straight face they are trying to relentlessly pursue efficient performance while doing this? What does a sub-1440p ultrabook need the ability to encode and decode 8K for? But I can very easily give a good reason to have HW decode for AV-1, pic related
also no ray tracing HW despite metal having ray tracing support for years now, what the frick?
all in all there's a lot to like but also a lot of "what the frick were they smoking?"
>>also no ray tracing HW despite metal having ray tracing support for years now, what the frick?
>
>all in all there's a lot to like but also a lot of "what the frick were they smoking?"
Not defending apple because honestly I don't give a frick about tech companies but could apple have not included their own tensor cores because they're inefficient performance per watt? Don't nvidia chips chew through watts?
I think this is reasonable, unlike only having one external display.
>but could apple have not included their own tensor cores because they're inefficient performance per watt? Don't nvidia chips chew through watts?
Apple DOES include their own "tensor cores", and they have for a while. They call it the "Neural Engine"
Nvidia's performance/watt woes are mostly from using GDDR6X and their aggressive boosting behavior
>Don't nvidia chips chew through watts?
Yes, but you're getting just as much raw performance out of it. Apple's M1 Max only matches a 3080 in Adobe Premiere, and that's not entirely because of GPU performance.
NVIDIA's core is actually quite efficient they just overvolt the shit out of it. Same with RDNA2. The Steam Deck proved that. RDNA2's perf/watt is actually pretty insane.
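That matches the usual first-order model: dynamic power scales roughly with C·V²·f while performance scales roughly with f, so perf/watt goes as 1/V² and overvolting for the last few hundred MHz wrecks efficiency. A toy illustration (the 10% undervolt below is a made-up number for the example, not a measured figure for any real chip):

```python
def perf_per_watt(freq_ghz, volts, c=1.0):
    """Toy model: perf ~ f, dynamic power ~ C * V^2 * f."""
    power = c * volts**2 * freq_ghz
    return freq_ghz / power            # algebraically: 1 / (C * V^2)

stock = perf_per_watt(2.5, 1.00)
undervolted = perf_per_watt(2.5, 0.90)   # hypothetical 10% undervolt, same clock
print(round(undervolted / stock, 3))     # ~1.235: ~23% better perf/watt
```

Same logic in reverse is why the Steam Deck's conservatively clocked RDNA2 looks so efficient next to a desktop card boosted to the edge of its voltage curve.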
Modest upgrade over the original M1.
Very Modest.
1080p webcams finally.
Still limited to 2 displays total.
RAM can now go to 24GB instead of the original ceiling of 16GB.
It's criminal that an 8GB option still exists and is the default.
It's criminal that you still only get ONE external display.
24GB RAM ceiling is a joke. 32GB should have been an option.
You can't convince me that a Pro laptop shouldn't be able to have 32GB of RAM. In 2022.
The new MacBook Pro 13 is a lie. It is not a Pro. They should have renamed it to simply MacBook. Or the MacBook Air Fans Edition. Because that is what it is: a MacBook Air with Fans.
They didn't even change or update the industrial design of the Pro. The new design of the Air is at least a change. Those borders around the Pro screen are criminally large compared to the Air. Jony Ive's ghost still haunts Apple it seems.
Like the original base M1, the base M2 is a hobbled chip that belongs in an iPad. Stay away.
Wait for the upgraded M2-derived chips. Like M2 Pro/Max/Ultra. Or don't wait and just buy a M1-Pro/Max/Ultra machine that has features you need like larger RAM or more displays.
Base M chips are Apple’s Celeron, the real deal will be M2 Pro and beyond
no, more like
base M = i3
Pro = i5
Max = i7
Ultra = i9
>Can't support more than 1 external
>8k h.264 for some reason
>No raytracing
In any case I'm glad I didn't wait for M2. A 19% CPU speed improvement after two years really isn't all that, and honestly I don't even care that much about CPU speed anymore. It has no appreciable impact on my quality of experience or my productivity. Computers have just gotten to the point where I have no real need to care about the latest and greatest.
Seems mediocre as expected.
I honestly wasn't expecting them to break the 16GB limit, and 24GB is quite a bit for an SoC that can only power one external display (so less VRAM pressure). They probably did it for the sake of 8K encoding. By the way, I think that's pretty much a confirmation that the iPhone 14 Pro records in 8K.
Honestly I think it makes nerds angrier than it actually matters; incredibly few users significantly benefit from >24GB of memory, and they also sell the MBP. The chip is fine at what it's trying to be; the 24GB memory limit is the least of its problems compared to being ARM or supporting a single fricking external display.
>What does a sub-1440p ultrabook need the ability to encode and decode 8K for?
Iphone 14 Pro
I have no idea what "industry grade" means, and the machine is as fast as anything on the market at its intended purpose: browsing Facebook. This is a consumer computer; I'm buying one for my stepfather. Technically it's marginally faster, but not to a degree that anybody sane would care. I have a machine filled with "industry standard" software and I have yet to run into any of my tools not working.
You're being vague, unspecific, and lazy because you're more concerned about blowing out crapple than spending 5 seconds critically analysing the system.
True, this is kind of the shit generation between the m1 and the m3.
$1200 minimum to browse facebook? get a grip
Still only 1 external display
Is it armv9?
Don't know but it's SUuuuuUuuuPeeeEerrr efficient for those intense web browsing sessions.
until you try to watch youtube or netflix and get buttfricked with software AV-1 decode
Nah, it's alright. AV1 runs fine on a dual-core Skylake i5.
>still not socketable
>still can't upgrade ram or storage
>still no pci lanes
it does not matter how it performs, it's garbage totalitarian unusable trash
>Upgrade
But then apple can't overcharge for the higher tier models
>Replace parts
Apple already has this problem with some well-made generations of MBPs and iPhones working 12 years later. They have to blacklist them in OS updates and installers to stop midwits and morons from using them as long as they work, robbing Apple of sales.
I’ll wait for the benchmarks but the 20% bump in price and the base model only having 8GB ram is effectively pushing me toward a base M1 Pro
I dread a Mac mini with a base M2.
Still not enough RAM.
Still only 2 displays.
>I dread a Mac mini with a base M2.
>Still not enough RAM.
>Still only 2 displays.
For that matter an iMac with a base M2 is also a nightmare.
2024 is now looking like the year when Apple might un-hobble the base Apple Silicon. M3 might finally give 3 displays and 32GB of RAM.
If this pattern keeps up:
Tick-Tock evolution might exist for Apple Silicon.
2020 Tick. Original M1
2021 Tock. Evolved M1 (Pro/Max/Ultra)
2022 Tick. Original M2.
>not ready for desktop
huurrrffffdurrrrfff
no
>Apple’s new M2 processor is mostly an update to the M1, rather than a successor. That mainly comes down to the manufacturing process M2 is built on. Chipmaker TSMC is behind manufacturing for the M1 and M2, and Apple says the M2 comes with a “second-generation 5nm” node.
>For TSMC, which is by far the world’s largest semiconductor company, a full node improvement is what you’re looking for between CPU generations. That means shrinking the manufacturing process to fit more transistors on the chip while improving efficiency. The problem is that TSMC delayed its next-gen node in 2021, and it appeared to be a prime candidate for Apple’s M2.
>The M1 is built on TSMC’s N5 node, and the M2 will almost certainly use the N5P node. The true next-gen node is N3, which is a 3nm process that delivers up to 15% higher performance and 30% lower power draw versus N5. By comparison, N5P is a 7% improvement with 15% less power draw.
https://www.digitaltrends.com/computing/apple-m2-not-next-gen/
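Taking the article's figures at face value, the gap between the two nodes is easy to put a number on. This is a crude reading, since foundries quote "iso-power" and "iso-performance" gains that can't strictly be multiplied together like this, but it shows the order of magnitude:

```python
# Quoted gains vs. plain N5, as (performance multiplier, power multiplier)
n5p = (1.07, 1.00 - 0.15)   # "7% improvement with 15% less power draw"
n3  = (1.15, 1.00 - 0.30)   # "15% higher performance and 30% lower power draw"

def perf_per_watt_gain(perf, power):
    """Combined perf/watt multiplier relative to the N5 baseline."""
    return perf / power

print(round(perf_per_watt_gain(*n5p), 2))  # ~1.26x over N5
print(round(perf_per_watt_gain(*n3), 2))   # ~1.64x over N5
```

Even read that charitably, N5P lands at barely a third of the perf/watt jump N3 would have offered, which is why M2 on N5P reads as "M1 Plus."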
Didn't 3DO make the M2 back in the '90s?
Depends on how well it does on the GTA V benchmark. M1 didn't do too well.
>running x86 emulation on top of windows emulation on a 35W part vs a 500W+ desktop
nvidiots everyone
The point is x86 emulation performance for things that don't play nice with Rosetta 2 (i.e. mostly everything). You can think of GTA V as a baseline to compare everything against.
People fricking rub their wieners fricking raw to GTAV benchmarks as if Apple users spend all their time playing Video games, things Apple users are well known for doing, buying their Macs to play video games on and all.
The bigger issue is that Rosetta is so ungodly buggy that having a SINGLE Rosetta 2 process can frick the audio subsystem into disrepair until you reboot. Of course IQfy doesn't know what the frick they're talking about, so they don't talk about actual problems Macs have; they just think Apple users run high-performance software that isn't Apple Silicon native all day and drool onto their keyboards going "WHY THIS SLOWW??!!", with no realistic impression of who uses these fricking computers. Nobody is playing GTAV on these fricking computers. They're running some shitware driver or enterprise crapware that is Rosetta-only, and it causes the Mac to become so fricking unstable you have no choice but to reboot to resolve the massive memory leak or whatever the frick happens, because Apple is a piece of shit fricking shit company.
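If you want to hunt those stray translated processes down yourself, macOS exposes a sysctl key (`sysctl.proc_translated`) that reports whether the querying process is running under Rosetta. A small sketch that shells out to `sysctl` and returns None anywhere the key doesn't exist (Linux, Intel Macs, etc.):

```python
import subprocess

def rosetta_translated():
    """Return 1 if this process is Rosetta-translated, 0 if running natively,
    or None where the sysctl key doesn't exist (non-macOS, Intel Macs)."""
    try:
        out = subprocess.run(
            ["sysctl", "-n", "sysctl.proc_translated"],
            capture_output=True, text=True,
        )
    except FileNotFoundError:          # no sysctl binary on this system at all
        return None
    if out.returncode != 0:            # key missing: sysctl reports an error
        return None
    try:
        return int(out.stdout.strip())
    except ValueError:
        return None

print(rosetta_translated())
```

For everything else on the system, Activity Monitor's "Kind" column shows the same thing per process (Apple vs. Intel), which is the quickest way to spot the shitware driver that's still Rosetta-only.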
It's not the game, it's the x86 emulation performance. You DO realize you can't just slap the rosetta 2 band aid to everything and the majority of devs would rather drag their nutsacks over a mile of broken glass before porting their x86 software to arm when x86 emulation exists, right?
Yes numbnuts, I understand your obsession
"emulation is SSLOWWWW! MAChomosexualS BTFFOO!!!!"
You guys have been repeating it endlessly for two fricking years and literally cannot comprehend anybody, ANYBODY who was running any application remotely performance sensitive either waited for the apple silicon port before buying the Mac or just bought Windows because people aren't nearly as moronic as IQfy users who project their own moronation onto other computer users. What a profound point numbnuts, you shouldn't buy a computer to run performance sensitive software if you're going to be emulating it all the time, no fricking shit!
Anybody who actually bought crapple who doesn't have severe brain damage is running basically all native software all the fricking time and maybe like 5 fricking rosetta 2 processes in the background that use like 0.1% CPU but still manage to destroy your computer somehow. The majority of devs would rather drag their nutsacks over a mile of broken glass instead of port?
Look at the shit that has been ported.
https://isapplesiliconready.com/for/productivity
IDEs? Mostly ported.
Music production? Mostly ported.
Video production? Mostly ported.
Photo editing? Mostly ported.
Random productivity tools? Mostly ported, and many of them run in safari anyways, and they're not performance sensitive.
I'm using exactly 5 processes that are not Apple Silicon native: 2 of them were ported already but I'm too lazy to reinstall to update, and the rest are Logitech shitware they will never update, some enterprise cloud-sync shitware I'm actively working to replace since it's redundant and terrible, and Toggl Track. These cumulatively use 1% of the CPU. I swear to god, reading IQfy you would think my computer is fricking bursting into flame, straining under the load of massive amounts of Rosetta-emulated heavy-duty software, the porting effort having been a complete and utter failure.
A lot of those ports are genuinely SLOWER than x86 emulation.
>t. returned my m1 air a week later
I don't see GTA IV let alone GTA V anywhere in there.
Slower than any industry grade X86 machine. Doesn't run any industry standard software. Congratulations applegays, if your shit wasn't linux tier before it is now.
As always, anyone with work to do uses a windows machine. Apple is a lifestyle company and nothing more. No one has a real reason to use apple shit over windows (the official OS of having a job) unless forced to by iOS development requirements.
This is M1 Plus, not M2. Same 5nm-class process jump as iPhone 12→13: 15% better performance, or the same performance at 15% better efficiency. They took the performance gains with the Mac and slapped on an additional memory controller to allow for more RAM; the iPhone took the efficiency gains and got two-day battery life. The improved process allowed for more GPU power, but not substantially more. These MacBooks will most likely also have worse battery life than the M1 machines they're replacing. The latest rumors look to be true: the real M2 debuts next year as M2 Pro/M2 Max/M2 Ultra on an advanced 3nm process in the new MacBook Pro, iMac Pro, and Mac mini/Studio/Pro. The original Apple Silicon rumors have been 99% correct, and M2 was supposed to have a 12-core GPU, allowing for dual display support. Now another year will go by where you cannot buy a Mac for less than $2K that supports more than one display out of the box. A crime; thank god DisplayLink exists.
>20 HOUR BATTERY LIFE WATCHING VIDEOS, WINDOWS BTFO*
*only when playing VP9 since AV1 isn't hardware accelerated
Aren't they dropping Rosetta 2 support soon?
>is the M2 the best out there?
yes if you can get by with a Raspberry Pi with a bunch of specialized coprocessors built in