Apple Can Save AI
Manifesto for AI Privacy in a Surveillance Economy
Long ago, tech companies and users lived in harmony. Everything changed when Advertising became the main business model. Tech companies mastered the power of algorithms and AI to exploit their users. They sold them to advertisers. They mined and exploited their behaviors. They created intentionally addictive products. They subverted democracies and societies for profit.
Only Apple still sold products to customers. Only they could return the potential of AI to helping users. But when AI was at its most hyped, Apple seemed to have vanished. But I believe Apple can save the world.
AI Isn’t About Tech Leadership
It doesn’t have to be Apple. It could be any company that:
- is a privacy company because it does not depend on surveillance capitalism
- users trust as an intermediary between them and their services
- provides private storage that users control
Apple has no technological edge in the AI arms race. Their main difference is their business model. They have a consumer edge.
Most companies are using AI to package and sell their users to advertisers. Apple’s business is selling hardware to consumers. More than half of their revenue comes from selling iPhones: mobile devices that offer security, local storage, and private clouds, and that act as a trusted, personal intermediary between the user and the services they access.
Apple does have an $85B services business, but it is built on, and built to sell, the Apple hardware ecosystem. Apple users own their devices and their data. They pay a premium for this ownership and privacy.
Apple is positioned to sell AI to users instead of users to advertisers.
Privacy Takes Effort
We have the tools for privacy, but they take a lot of effort. Humans are not naturally good at security; ease of use and convenience trump paranoia.
Computers already require too much admin without having to manage passwords, set up proxies, understand various sites and their privacy settings, and create disposable emails. People are not clearing caches, managing trackers, and staying on top of user data leaks.
The tools exist, but most people don’t have the time or inclination to use them, let alone keep them up to date.
Siri as Your Personal Data Intermediary
More than a billion iPhones are in active use worldwide, each one running Apple’s virtual assistant, Siri. Where humans are not good at the tedious and repetitive work of protecting their privacy, computers are.
Apple owns the biggest footprint for personal agents on devices. Users’ data is either stored locally on their devices or in private or temporary clouds that Apple doesn’t mine.
Instead of understanding and setting privacy options for every one of your apps and websites, you set them once with Siri. Unlike products that rely on surveillance capitalism, Siri can make your settings simple to manage and clear about what they do.
Instead of letting apps and sites store data about you, all the data is stored on your devices. When a third-party app wants to access your data, it requests it from Siri, who acts as your personal data intermediary and enforces your global privacy settings.
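To make that concrete, here is a minimal sketch in Swift of what such an intermediary could look like. Everything in it (the `DataIntermediary`, `DataRequest`, and scope types) is hypothetical and invented for illustration; it is not a real Apple API.

```swift
import Foundation

// Hypothetical sketch: a personal data intermediary that third-party
// apps must go through instead of storing user data themselves.

/// Categories of personal data an app may ask for (illustrative).
enum DataScope: String {
    case contacts, location, viewingHistory
}

/// A single, auditable request from a third-party app.
struct DataRequest {
    let appID: String      // who is asking
    let scope: DataScope   // what they want
    let purpose: String    // why, shown to the user in plain language
}

/// The user's global privacy settings: set once, enforced everywhere.
struct PrivacyPolicy {
    var allowedScopes: [String: Set<DataScope>] = [:]  // appID -> permitted scopes

    func permits(_ request: DataRequest) -> Bool {
        allowedScopes[request.appID]?.contains(request.scope) ?? false
    }
}

/// The assistant: the only component with direct access to local data.
final class DataIntermediary {
    private let policy: PrivacyPolicy
    private let localStore: [DataScope: Data]  // data lives on the device

    init(policy: PrivacyPolicy, localStore: [DataScope: Data]) {
        self.policy = policy
        self.localStore = localStore
    }

    /// Apps never read the store directly; they ask, and the policy decides.
    func fulfill(_ request: DataRequest) -> Data? {
        guard policy.permits(request) else {
            print("Denied: \(request.appID) asked for \(request.scope.rawValue)")
            return nil
        }
        return localStore[request.scope]
    }
}

// One global setting covers every app that wants your viewing history.
var policy = PrivacyPolicy()
policy.allowedScopes["com.example.flix"] = [.viewingHistory]
let siri = DataIntermediary(policy: policy,
                            localStore: [.viewingHistory: Data("S02E05".utf8)])
_ = siri.fulfill(DataRequest(appID: "com.example.flix",
                             scope: .viewingHistory,
                             purpose: "Improve recommendations"))
```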
First, we need to solve two things:
- how to make Siri smarter
- how to enforce your privacy
LLMs and Smarter Virtual Assistants
The biggest problem with Siri is that it hasn’t been very good at understanding humans. Siri + LLM would apply the retrieval-augmented generation (RAG) pattern to Siri and your apps.
The problem with chatbots like ChatGPT is that they can only know things that happened before their training date, the so-called “knowledge cutoff date.” The original public release of ChatGPT, for example, knew almost nothing about events after its 2021 training cutoff.
Training a new version of an LLM can take months and is very expensive. To get around this, we connect LLMs to search systems that let them pull in more current information. They can use their language abilities to form queries and parse results on the fly, extending their knowledge.
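A toy sketch of that loop: a naive keyword matcher stands in for the search system, and a stubbed `llm` function stands in for the model. The corpus, names, and scoring are all illustrative.

```swift
import Foundation

// Toy sketch of retrieval-augmented generation (RAG): fetch documents
// relevant to the query, then prepend them to the prompt so the model
// can answer with information newer than its training cutoff.

let corpus = [
    "2024-06-10: Apple announces Apple Intelligence at WWDC.",
    "2023-09-22: iPhone 15 goes on sale.",
]

/// Stand-in retriever: score documents by naive keyword overlap.
func retrieve(_ query: String, from docs: [String], top k: Int = 1) -> [String] {
    let terms = Set(query.lowercased().split(separator: " "))
    return docs
        .map { ($0, Set($0.lowercased().split(separator: " ")).intersection(terms).count) }
        .sorted { $0.1 > $1.1 }
        .prefix(k)
        .map { $0.0 }
}

/// Stand-in for a call to a real language model.
func llm(_ prompt: String) -> String { "(model answer grounded in: \(prompt))" }

func answer(_ query: String) -> String {
    let context = retrieve(query, from: corpus).joined(separator: "\n")
    // The retrieved context extends the model past its knowledge cutoff.
    return llm("Context:\n\(context)\n\nQuestion: \(query)")
}

print(answer("When did Apple announce Apple Intelligence"))
```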
For virtual assistants, RAG can extend not just knowledge but also capabilities. Siri can connect to and invoke your apps. LLM-based APIs will enable Siri to “absorb” the capabilities of your applications.
You can extend and customize your Siri’s abilities simply by installing apps. Using RAG patterns, Siri will search and talk to your apps on your behalf.
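One way that “absorbing” could work: each installed app describes the actions it can perform, and the assistant routes a parsed request to whichever app handles it. The protocol and apps below are invented for illustration; a production version would look more like Apple’s App Intents framework, with an LLM doing the parsing.

```swift
import Foundation

// Hypothetical sketch: apps advertise capabilities to the assistant,
// which matches the user's parsed request to an installed app.

/// A capability an installed app exposes to the assistant.
protocol AppCapability {
    var appName: String { get }
    var verbs: Set<String> { get }   // verbs this app handles, e.g. "play"
    func perform(_ verb: String, argument: String) -> String
}

struct MusicApp: AppCapability {
    let appName = "Music"
    let verbs: Set<String> = ["play"]
    func perform(_ verb: String, argument: String) -> String {
        "Now playing \(argument)."
    }
}

struct RideApp: AppCapability {
    let appName = "Rides"
    let verbs: Set<String> = ["book"]
    func perform(_ verb: String, argument: String) -> String {
        "Booked a ride to \(argument)."
    }
}

/// The assistant: installing an app extends what it can do.
struct Assistant {
    var installed: [AppCapability] = []

    /// A real system would use an LLM to parse the utterance; here we
    /// fake it as "verb rest-of-sentence".
    func handle(_ utterance: String) -> String {
        let parts = utterance.split(separator: " ", maxSplits: 1).map(String.init)
        guard parts.count == 2,
              let app = installed.first(where: { $0.verbs.contains(parts[0]) })
        else { return "Sorry, no installed app can do that." }
        return app.perform(parts[0], argument: parts[1])
    }
}

let siri = Assistant(installed: [MusicApp(), RideApp()])
print(siri.handle("play Purple Rain"))    // Now playing Purple Rain.
print(siri.handle("book the airport"))    // Booked a ride to the airport.
```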
Privacy Built In
Siri, your virtual assistant, acts as your representative. Each app is an untrustworthy agent that Siri engages with. While you cannot trust the apps and their makers to respect your privacy, you can trust your assistant.
It doesn’t have to be Siri or Apple. It just cannot be a personal assistant who is spying on you for someone else.
Your assistant enforces your privacy by acting as your private data intermediary. All systems that want to access your private data must go through your assistant.
You now have one place to manage your global privacy preferences. You have one place to manage what data is tracked and who can access it. Your assistant works for you, and will not bury your choices under a pile of incomprehensible T&Cs. The ground rules, sketched in code after this list, are:
- Only your assistant has reliable access to your private data.
- Your assistant authenticates your identity for remote services.
- Your assistant only stores data in your personal devices or private cloud.
- Any third party that wants temporary access to your data must request it from your assistant.
- Any third party that wants to remember something about you must ask your assistant to remember it for them, so you manage your data, not them.
- Any access to your data can be revoked at any time.
- You own all the data and can examine it, remove it, or alter it as you see fit.
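A minimal sketch of the data model those rules imply: every third-party access becomes a revocable, expiring grant that only the assistant records and enforces. All types here are hypothetical.

```swift
import Foundation

// Hypothetical sketch of the rules above: all third-party access flows
// through revocable grants recorded by the assistant alone.

struct Grant {
    let id: UUID
    let thirdParty: String
    let field: String     // e.g. "email", "watchHistory"
    let expires: Date?    // temporary access lapses on its own
}

final class AssistantVault {
    private var data: [String: String] = [:]   // the user's data, kept locally
    private var grants: [UUID: Grant] = [:]    // the single audit log of access

    func store(_ field: String, value: String) { data[field] = value }

    /// Temporary access must be requested, and is always revocable.
    func requestAccess(party: String, field: String, for duration: TimeInterval) -> UUID {
        let grant = Grant(id: UUID(), thirdParty: party, field: field,
                          expires: Date().addingTimeInterval(duration))
        grants[grant.id] = grant
        return grant.id
    }

    /// Data flows out only through a live, unexpired grant.
    func read(grantID: UUID) -> String? {
        guard let grant = grants[grantID],
              grant.expires.map({ $0 > Date() }) ?? true
        else { return nil }
        return data[grant.field]
    }

    /// Any access can be revoked at any time.
    func revoke(grantID: UUID) { grants[grantID] = nil }

    /// The user can examine, alter, or remove everything.
    func audit() -> [Grant] { Array(grants.values) }
    func erase(_ field: String) { data[field] = nil }
}
```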
Personal Data Intermediary
How can this work? How can we trust third parties to follow the rules?
We can’t, so we don’t. We only trust our assistants.
Surveillance capitalism services want to track your identity. As they monitor your behavior, they record what they learn against this identity. They need you to sign up for accounts. They want stable identifiers, like your phone number or email address.
Your private virtual assistant breaks this. When you first use a service, it will want you to register. Your assistant will do this for you.
It will create a unique token for this registration. It will automate the tedious work of setting up temporary or proxy records such as email addresses and phone numbers.
If the service needs a payment method, your assistant can offer Apple Pay or a virtual credit card.
You will never trust a remote system with your actual payment information or permanent contact information again.
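From the assistant’s side, registration could look like the sketch below: a fresh account token per service, a relay address in the spirit of Hide My Email, and a virtual card number standing in for the real one. All names, formats, and digits are made up for illustration.

```swift
import Foundation

// Hypothetical sketch: the assistant registers with each service using
// disposable identifiers, never the user's real contact or payment details.

struct ServiceIdentity {
    let service: String
    let accountToken: UUID   // unique per service, replaceable at will
    let proxyEmail: String   // relay address that forwards to the real inbox
    let virtualCard: String  // stands in for the real payment method
}

final class RegistrationAgent {
    private(set) var identities: [String: ServiceIdentity] = [:]

    /// Create a fresh, unlinkable identity for a service.
    func register(with service: String) -> ServiceIdentity {
        let token = UUID()
        let identity = ServiceIdentity(
            service: service,
            accountToken: token,
            proxyEmail: "\(token.uuidString.prefix(8).lowercased())@relay.example",
            virtualCard: "4000-0000-0000-\(Int.random(in: 1000...9999))"  // fake digits
        )
        identities[service] = identity
        return identity
    }
}

let agent = RegistrationAgent()
let flix = agent.register(with: "netflix")
print(flix.proxyEmail)  // e.g. "1a2b3c4d@relay.example"
```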
Trust Can Be Revoked
Services might need to store additional information about you to personalize their service. This is a great benefit for users.
For example, Netflix watches what we watch so that it can make better recommendations. If Netflix wants to track this long-term, it should ask our virtual assistant to store the information in our private cloud for it. If Netflix keeps the data on its own servers, it will only be associated with a temporary email address or ID that our virtual assistant generated.
At some point, either we or our assistant decide it is time to rotate our IDs. Our assistant does this because it is paranoid. It doesn’t know or trust what remote systems are tracking against those IDs. So it creates new ones.
Next time we return to Netflix, the assistant negotiates a new account. It tells Netflix that we are a previous user and still have Netflix data that it can request, but only the data we decided to keep and share.
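Sketched in code, the rotation step might work like this: mint a new token once the old one is stale, keep only the data the user chose to retain, and offer that data back on re-registration. Again, every type and number here is illustrative.

```swift
import Foundation

// Hypothetical sketch: periodically rotate the identifiers a service
// knows us by, while our chosen data stays on our side of the fence.

struct RotatingIdentity {
    var token: UUID
    var created: Date
}

final class RotationAgent {
    private(set) var identity = RotatingIdentity(token: UUID(), created: Date())
    private(set) var retainedData: [String: String] = [:]   // e.g. watch history we kept
    let maxAge: TimeInterval = 60 * 60 * 24 * 90            // rotate every ~90 days

    /// Paranoia by default: mint a new token once the old one is stale.
    func rotateIfNeeded(now: Date = Date()) {
        guard now.timeIntervalSince(identity.created) > maxAge else { return }
        identity = RotatingIdentity(token: UUID(), created: now)
        // retainedData survives rotation; the service's view of us does not.
    }

    /// Re-register, offering back only the data we decided to keep and share.
    func negotiateReturn(sharing keys: Set<String>) -> (token: UUID, shared: [String: String]) {
        rotateIfNeeded()
        return (identity.token, retainedData.filter { keys.contains($0.key) })
    }
}
```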
Some third parties will try to cheat and hide fingerprints or unique signatures that they could use to link your client-side data with your previous account.
When companies are caught doing this, Siri, or whatever virtual assistant you are using, will block the sneaky fields, scramble them, or maybe even remove the offender’s access to all client-side data requests.
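On the defensive side, the assistant could screen whatever a service asks it to hold. The heuristics below (a field-name blocklist plus a long-opaque-string check) are deliberately crude; a real screen would be far more sophisticated.

```swift
import Foundation

// Hypothetical sketch: screen client-side data a service wants us to hold,
// dropping fields that look like hidden fingerprints.

let suspiciousNames: Set<String> = ["deviceid", "fingerprint", "uid", "tracking_id"]

/// True if a value looks like a long opaque identifier rather than content.
func looksLikeIdentifier(_ value: String) -> Bool {
    value.count >= 16 && value.allSatisfy { $0.isHexDigit || $0 == "-" }
}

/// Keep only fields that pass the screen; log whatever was blocked.
func screen(_ payload: [String: String]) -> [String: String] {
    payload.filter { key, value in
        if suspiciousNames.contains(key.lowercased()) || looksLikeIdentifier(value) {
            print("Blocked suspicious field: \(key)")
            return false
        }
        return true
    }
}

let incoming = [
    "lastEpisode": "S02E05",
    "deviceId": "a1b2c3d4e5f6a7b8c9d0",
]
print(screen(incoming))  // ["lastEpisode": "S02E05"]
```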
Why Would Companies Allow This?
They don’t really have a choice. Apple has already blocked advertisers from the most egregious forms of tracking on their devices and browsers.
All the technology is already in place on your iPhone: Hide My Email, Apple Pay, private storage, a protocol for remote systems to store data client side, and virtual assistants. What has been holding this back is automation. Right now, we still rely on users being savvy enough, and willing, to do a lot of tedious setup and remembering. That is what a virtual assistant is for!
Automation will make it easier for customers to do this, and thus to demand it from the companies they get services from. Naturally, some businesses are so entwined with surveillance capitalism that they won’t be able to switch. Some users don’t mind trading privacy for access to free content and services. But it will be an enforceable choice.
Instead of relying on legal frameworks, AI can put the ability to enforce privacy into the hands of users. We can make it simple, transparent, auditable, and enforceable. We can restore user trust in companies that are doing the right thing.
It doesn’t have to be Apple, but I think Apple can save privacy and the world.