Apple's chipset advantage has me more jealous than ever


Around two months ago, a rumor circulated that Google was hard at work on designing its own smartphone chipset. While gossip about a Google-built mobile chip has circled the internet for years, it was the firmest evidence yet that the Mountain View company was giving serious consideration to a much more vertically integrated smartphone platform. No other reliable information has emerged in the meantime, though, and on a day like today, this all feels like something worth talking about.
Apple has announced it will transition the last of its products not already using Apple chipsets to the company's own processors. That means the MacBook, Mac Mini, iMac, and Mac Pro will all soon (relatively speaking: in the next two years) be powered by the same kind of ARM cores you'll find inside an iPhone, iPad, or Apple Watch. They'll be bigger, faster, and more capable, but they'll be the same basic family of chips, enabling Apple to do everything from drastically reducing power consumption in its laptops, to greatly reducing the heat produced by its desktops, to (potentially) adding things like mobile data connectivity to Macs. The list of potential upsides is truly overwhelming; just think about everything a smartphone or tablet does better than a laptop, and you'll quickly begin to understand why the move makes so much sense.
One great example of such advantages came when Apple demonstrated that a console-class video game, Shadow of the Tomb Raider (granted, a game released in 2018), could run on an iPad Pro chipset in macOS. The demo wasn't perfect: the game ran at 1080p with fairly middling settings, but it did so at what appeared to be a steady 30FPS at minimum, all while running as an emulated x86 version of the game. That is: the game wasn't even natively compiled for ARM. But consider that Intel's most powerful laptop GPU, found in the 10th-generation Ice Lake series, can't break out of single-digit frame rates in this game at 1080p. Apple received some snark about this demo being lame, but it's only lame if you don't understand just how terrible modern integrated laptop GPUs are. Apple is using a (very likely fanless) system on a chip to run laps around the latest and greatest from Intel. There is nothing Intel makes with a more powerful GPU. Oh, and all evidence suggests it's doing this at under half the cost of using that Intel chip. That's not just impressive; it's absolutely nuts.
And this is where things start to look a little concerning for Google. It is increasingly clear that Apple is running away with its lead in the chipset wars, and that vendors like Intel and Qualcomm are simply unable to match Apple's single-minded obsession when it comes to silicon. This is, in part, because Qualcomm and Intel operate very different businesses than Apple. Apple has one customer: Apple. Apple only needs to build the chipset designs that Apple knows it wants for the limited portfolio of products it sells. Because those products sell at such immense scale, that shared chipset platform reaps huge returns on investment; it can be applied widely across Apple's product range. Vendors like Qualcomm and Intel must satisfy the needs of customers not just at different price points, but with different priorities and strategies. This makes maintaining a large array of chip options paramount, spreading resources across several families of products serving many kinds of customers.
For Google's hardware division, a group largely building aspirational and relatively high-end products it wants to define as an integrated ecosystem, this makes working with the likes of an Intel or a Qualcomm less than ideal. For example, when Google launched the Pixel Slate in 2018, Intel's Kaby Lake R CPUs had a difficult time driving the tablet's high-density display, with endless reports of lag until Google "fixed" the issue across a number of updates. Even then, the tablet continued to struggle with things like 4K video playback and gaming, owing to the Intel chipset's relatively weak graphics. Last year, when Google launched the Pixelbook Go, I learned that Google was unhappy internally with the performance of the 4K model, with those complaints specifically attributed to the lack of horsepower and throttle-happy nature of Intel's i7 chip.
On the smartphone side, Google's relationship with Qualcomm remains outwardly pleasant, but there's little denying that Qualcomm's expensive smartphone chips have forced Google to position its phones in a way that compares poorly to Apple: not only are Google's Pixel phones slower than Apple's in most key benchmarks, they generally present a poorer value even at lower prices. Apple's latest iPhones feature more cameras, more storage options, better displays, faster charging, and more powerful haptics and speakers in nicer materials, all while offering demonstrably superior performance (in some cases, like the GPU, far superior). This isn't to say Google's phones don't have redeeming qualities; they absolutely do, like photo processing and on-device AI (and of course Android itself). But even if Google were to go punch for punch against Apple on features, it would still face an impossible barrier on the chipset side. There is no super secret chip Qualcomm could build, if only Google were to bankroll the effort, that would actually be better than Apple's A-series chips. Qualcomm's GPUs, for example, aren't just much less powerful than Apple's, they're massively less efficient. That's not a gap you can simply spend your way around; you need legitimate technological breakthroughs.
This isn't even to speak of wearables, where Qualcomm and Intel have basically let down Google's Wear OS platform wholesale (and for which I think Google is just as much to blame, frankly). Smartwatches simply don't ship in large enough numbers for these companies to commit the massive resources necessary to engineer cutting-edge, highly competitive solutions. They're going to build what works and what they believe their customers will buy.
Thinking about all of this, I came away from today's WWDC announcement feeling a bit jealous. Jealous that Apple's tightly integrated approach to hardware, coupled with its walled-garden approach to software, would put legitimately useful features and integrations out of my reach unless I switched to iOS. Jealous that my (relatively) new laptop will probably have circles run around it on graphics, battery, and connectivity by whatever next-gen ARM MacBook Apple drops next year. Jealous that I still can't have a wearable for Android that doesn't suck.
I'm not claiming to have the business and technical insight to judge the viability of Google pursuing a more Apple-like model when it comes to designing and sourcing hardware. It may be that Apple's lead on ARM chips is so large that Google could never hope to catch up without spending truly laughable sums of money, money it could never guarantee a return on. Designing chips isn't like designing a sports car; you can't just look at the competition, raid the parts bin, and hope you hit a few key performance benchmarks. This is insanely complicated stuff, researched by people on the absolute bleeding edge of computer science and electrical engineering. Still, Apple has proven it can be done. "Not impossible" isn't an easy bar to meet, but nonetheless: the bar exists.