Bracing for a consumer electronics supercycle
Thinking about a new phone, tablet, or laptop? Maybe wait a year.
I think we’re just on the cusp of a “supercycle” in consumer electronics (specifically desktops, laptops, tablets, and smartphones) the likes of which we’ve not seen for two decades.
What do I mean by a “supercycle”? Well, to get technical: I mean a lot of people are going to buy a lot of shit, because the new shit works better than their old shit.
And, yep, it’s driven by AI.
I’ll explain, but I’ll let you in on the takeaway right from the start: with a few exceptions, the 2024 holiday season is not a great time to buy a new computer, tablet, or smartphone for your home office.
Your gear can’t handle AI
Generative AI models, including large language models (LLMs) and diffusion models, pose interesting challenges for consumer electronics. They work best on hardware capable of executing many tasks in parallel. They also require a lot of fast memory.
Let’s say I want to run one of Meta’s smaller LLMs, like Llama 3.1-8B, on a laptop. This is easy to do with apps like LM Studio or Ollama. Just download, press a few buttons, and I’ve got AI.
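With Ollama, for instance, the whole thing boils down to a few lines of Python. Here’s a minimal sketch using Ollama’s Python library (the prompt is just an example, and it assumes you’ve already installed Ollama and pulled the model):

```python
# Minimal sketch: chat with a local Llama 3.1-8B model via the
# ollama Python library. Assumes the Ollama app is running and the
# model was downloaded beforehand with `ollama pull llama3.1:8b`.
import ollama

response = ollama.chat(
    model="llama3.1:8b",
    messages=[{"role": "user", "content": "Summarize the plot of Moby-Dick in two sentences."}],
)
print(response["message"]["content"])
```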
But even Llama 3.1-8B (which, again, is a relatively small model) chugs on a Microsoft Surface Laptop with a Qualcomm Snapdragon X Elite and 32GB of RAM. Getting a response can take 10 or 20 seconds, or more.
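Why so slow? Generating each token of a response requires streaming roughly the entire model through memory, so single-user speed is capped by memory bandwidth. Here’s a rough upper-bound sketch (the bandwidth figure is an assumption, in the ballpark for thin-and-light laptops):

```python
# Rough ceiling on generation speed for a dense LLM: each new token
# requires reading roughly all of the model's weights once, so
# tokens/sec <= memory bandwidth / model size in memory.
model_gb = 4.5        # Llama 3.1-8B quantized to ~4-bit, plus overhead
bandwidth_gbs = 100   # assumed laptop-class memory bandwidth, GB/s

tokens_per_sec = bandwidth_gbs / model_gb
response_tokens = 250  # a few paragraphs of output
print(f"~{tokens_per_sec:.0f} tokens/sec, "
      f"~{response_tokens / tokens_per_sec:.0f}s for a full response")
# ~22 tokens/sec, ~11s for a full response
```

And that’s the ceiling, before the laptop does anything else with its memory.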
Think about that. A flagship Copilot+ PC, a laptop marketed exactly on its AI features, struggles to handle a relatively small LLM.
And, by the way, Llama 3.1-8B kind of sucks.
The “8B” stands for “8 billion parameters.” To say more parameters is better is a huge simplification, but at a basic level, it works. The best version of Llama 3.1 has 405 billion parameters. GPT-4 probably has over a trillion.
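Parameter counts translate pretty directly into hardware demands. As a back-of-the-envelope rule, just holding a model’s weights takes about (parameters) × (bytes stored per parameter), and real usage runs higher once you add the working context and the OS itself:

```python
# RAM needed just to hold a model's weights, in GB (1 GB = 1e9 bytes):
# billions of parameters x bytes stored per parameter.
def weight_memory_gb(params_billions: float, bytes_per_param: float) -> float:
    return params_billions * bytes_per_param

for name, params in [("Llama 3.1-8B", 8), ("Llama 3.1-405B", 405)]:
    print(f"{name}: ~{weight_memory_gb(params, 2):.0f} GB at 16-bit, "
          f"~{weight_memory_gb(params, 0.5):.0f} GB at 4-bit")

# Llama 3.1-8B: ~16 GB at 16-bit, ~4 GB at 4-bit
# Llama 3.1-405B: ~810 GB at 16-bit, ~202 GB at 4-bit
```

Even aggressively quantized, the 8B model eats a meaningful slice of a typical laptop’s RAM, and the 405B version is simply out of reach for consumer hardware.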
So, yeah, you can see the problem. A rather bad large language model taxes a flagship laptop that’s quicker, and has more memory, than the vast majority of computers, tablets, and smartphones people own right now.
Your gear will need to handle AI
At this point you might say: ok, but I don’t use chatbots. Why should I care?
The answer can be found in new AI features like Apple Intelligence and Microsoft Copilot. These include text and image generation, but they do a lot more than that, because it turns out modern AI is just better than older methods at a number of important tasks, including:
Upscaling and processing images
Understanding written or spoken language
Translation
Real-time video effects
Search
Contextual menus and recommendations
And on, and on.
Apple and Microsoft are already baking these features into their operating systems and will continue to do so for years to come. Third-party developers are likely to follow their lead.
Will these AI features be absolutely mandatory to, like, open an email? No. They will become common, however. Unless you fancy moving your workflow to Linux (which, hey, that’s a valid option), Apple and Microsoft have already bought your ticket for this ride.
In many cases, the “AI” will be imperceptible. Microsoft’s AI glow-up for Windows Search is a good example. I’m sure Microsoft’s announcement of the change coming to Copilot+ PCs will be the last time anyone gives it a thought.
Unless it fails spectacularly, which, given Microsoft’s track record with Windows updates, is an option. But anyway…
Apple Intelligence is similarly reserved for recent Apple devices, with older iPhones getting the shaft. The iPhone 15 Pro is the only older iPhone that will support it. So if you want a version of Siri that actually understands what you’re saying, or AI-generated emoji, or notification summaries, you’ll need to buy a new iPhone.
This all may sound very theoretical, especially for those who’ve yet to use AI.
But the fact that Microsoft and Apple are already reserving AI features for cutting-edge computers, tablets, and smartphones should be a red flag. The features they’re adding right now aren’t especially numerous, or complicated, compared to what is likely to come next.
As these features continue to roll out, I think we’ll see the minimum requirements for new Apple and Microsoft operating systems rise with unexpected speed. Many less capable devices will be left out of new OS updates, or run a variant of the OS with the new features stripped out.
That implies the computers, smartphones, and tablets we own right now will become obsolete more quickly than we’re used to.
If that happens, it’s going to be quite a shock. Back in the ’80s and ’90s, it was common for technology to become obsolete in a year or two. But consumer tech hasn’t worked like that for a while.
So, what the heck do you buy?
Devices released next year will fare better. Probably. I expect, at least, that they’ll have reasonable NPUs. RAM might be a steeper challenge, because adding RAM to a device tends to be expensive.
So, what’s my advice?
First, if you don’t have a need, or strong desire, to buy a new device this year: maybe skip it. I’m not saying absolutely don’t do it. I just bought an iPhone. But if you’re on the fence, it’s fine to stay there.
Second, if you do buy a device this year, make sure it has a passable neural processing unit (NPU). I’d say 35 TOPS or more.
Third, go for that RAM upgrade, if one is available. RAM is important for handling large models. I’d look to buy at least 32GB for a computer. If you’re eyeing an iPad Pro and really do care about its AI performance, be aware the 256GB/512GB models have just 8GB of RAM, while the 1TB/2TB models have 16GB. With phones, you usually don’t have much choice.
I imagine devices that market themselves on their AI features, like the iPhone and iPad, the Google Pixel 9 lineup, and Microsoft’s Copilot+ PCs, are an adequate bet. I still think they look a bit under-equipped in terms of RAM, but there are also ways to minimize the memory AI features require.
What I definitely wouldn’t buy, though, is:
An older Apple/Android phone or tablet, or a Mac or Windows laptop that has less than 32GB of RAM
Or a device with an NPU that quotes a performance metric below 35 TOPS.
Desktops with GPUs may sorta-kinda sidestep the NPU requirement, though that’s unclear. At the moment, Windows won’t run AI features on an Nvidia or AMD desktop GPU, but Nvidia says that’ll change by the end of the year.
Side note: TOPS sucks as a metric, because there are different ways to measure it (vendors can quote throughput at different numeric precisions, for one). But at the moment, it’s what we’ve got. Don’t rely on an NPU quoting 50 TOPS to beat the snot out of one quoting 45 TOPS. But if one quotes 50 TOPS and another quotes 10 TOPS, well, the former probably is better.
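To illustrate (with made-up numbers; the 2× ratio is a common pattern, not a universal rule), the same chip can advertise very different figures depending on the precision the vendor counts operations at:

```python
# Hypothetical illustration: quoted TOPS depends on measurement.
# Many NPUs roughly double their counted operations per second when
# benchmarked at 4-bit precision instead of 8-bit, so the same
# silicon can be marketed with two very different numbers.
int8_tops = 45             # throughput counting 8-bit operations
int4_tops = int8_tops * 2  # often ~2x when counting 4-bit operations
print(f"Same NPU: {int8_tops} TOPS (INT8) vs. {int4_tops} TOPS (INT4)")
```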
Anyway…that’s all I’ve got to say on the topic. It’s evolving rapidly, really, so I think it’s important to give people in the market for a new computer, phone, or tablet the heads-up. Feel free to leave a comment if you’ve got any questions.