David Gerard brings up an interesting association: the crypto grifters, as their scam begins to disintegrate, have jumped ship to become AI grifters.
You’ll be delighted to hear that blockchain is out and AI is in:
- Sequoia Capital, who wrote a now-deleted glowing profile of FTX founder Sam Bankman-Fried, has gone full AI. See their home page. [Sequoia, archive of June 3]
- Crypto VC firm Paradigm has broadened its focus to include AI. [The Block]
- Circle’s Jeremy Allaire: “AI and blockchains are made for each other.” [Twitter]
- IBM: “The convergence of AI and blockchain brings new value to business.” IBM previously folded its failed blockchain unit into the unit for its failed Watson AI. [IBM]
It’s not clear if the VCs actually buy their own pitch for ChatGPT’s spicy autocomplete as the harbinger of the robot apocalypse. Though if you replaced VC Twitter with ChatGPT, you would see a significant increase in quality.
Huh. Interesting. I never trusted crypto, because everyone behind it was so slimy, but now they’re going to slime the AI industry.
Also interesting, though, is who isn’t falling for it. Apple had a recent shindig in which they announced all the cool shiny new toys for the next year, and they are actively incorporating machine learning into them, but they are definitely not calling it AI.
If you watched Apple's WWDC keynote, you might have noticed something missing: the term "AI" was never mentioned. Not even once. That's a stark contrast to recent events from other Big Tech companies, such as Google I/O.
The technology was referred to, of course, but always in the form of “machine learning” — a more sedate and technically accurate description.
Apple took a different route: instead of pitching AI as an omnipotent force, they pointed to the specific features they've built with the technology. Here's a list of the ML/AI features Apple unveiled:
- Improved Autocorrect on iOS 17: Apple introduced an enhanced autocorrect feature, powered by a transformer language model. This on-device machine learning model improves autocorrection and sentence completion as users type.
- Personalized Volume Feature for AirPods: Apple announced this feature that uses machine learning to adapt to environmental conditions and user listening preferences.
- Enhanced Smart Stack on watchOS: Apple upgraded its Smart Stack feature to use machine learning to display relevant information to users.
- Journal App: Apple unveiled this new app that employs on-device machine learning to intelligently curate prompts for users.
- 3D Avatars for Video Calls on Vision Pro: Apple showcased advanced ML techniques for generating 3D avatars for video calls on the newly launched Vision Pro.
- Transformer-Based Speech Recognition: Apple announced a new transformer-based speech recognition model that improves dictation accuracy using the Neural Engine.
- Apple M2 Ultra Chip: Apple unveiled this chip with a 32-core Neural Engine, which is capable of performing 31.6 trillion operations per second and supports up to 192GB of unified memory. This chip can train large transformer models, demonstrating a significant leap in AI applications.
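The common thread in the features above is that the model runs on the device itself. Apple's real models are transformers running on the Neural Engine; as a rough illustration of the on-device idea only, here is a deliberately tiny, hypothetical bigram completer (the `TinyCompleter` name and design are my own, not Apple's). The point it sketches is architectural: training and prediction both happen locally, so no keystrokes ever leave the device.

```python
from collections import Counter, defaultdict
from typing import Optional

class TinyCompleter:
    """Toy stand-in for on-device sentence completion.

    Apple's autocorrect uses a transformer language model; this bigram
    counter is only a hypothetical illustration of the privacy-relevant
    property: all data stays in local memory, nothing is sent to a server.
    """

    def __init__(self) -> None:
        # For each word, count which words follow it.
        self.bigrams: defaultdict[str, Counter] = defaultdict(Counter)

    def train(self, text: str) -> None:
        # "Training" happens entirely on-device, from local text.
        words = text.lower().split()
        for prev, nxt in zip(words, words[1:]):
            self.bigrams[prev][nxt] += 1

    def suggest(self, prev_word: str) -> Optional[str]:
        # Predict the most frequent follower of the previous word.
        counts = self.bigrams.get(prev_word.lower())
        if not counts:
            return None
        return counts.most_common(1)[0][0]

completer = TinyCompleter()
completer.train("the quick brown fox jumps over the lazy dog the quick brown fox sleeps")
print(completer.suggest("quick"))  # brown
print(completer.suggest("the"))    # quick
```

A real transformer replaces the bigram table with learned attention over the whole sentence, but the deployment shape is the same: the model weights live on the phone, and the phone does the inference.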
Unlike its rivals, which are building ever-bigger models with server farms, supercomputers, and terabytes of data, Apple wants AI models running on its devices. On-device AI sidesteps many of the data privacy issues that cloud-based AI faces: when the model runs on the phone itself, Apple doesn't need to collect user data to run it.
It also ties in closely with Apple’s control of its hardware stack, down to its own silicon chips. Apple packs new AI circuits and GPUs into its chips every year, and its control of the overall architecture allows it to adapt to changes and new techniques.
Say what you will about Apple as a company, but one thing they know how to do is make money. Lots of money. They also have first-rate engineers. Apparently they are smart enough not to fall for the hype.