I have a confession to make: I don’t actually hate Facebook. A social network that connects people and makes it easier to communicate is a nice idea in principle. And Facebook has already done some very good things for humanity. Drug trial companies, for instance, have used Facebook (FB) to find potential participants, slashing the significant costs of this important endeavor.
Further, the #DeleteFacebook movement misses the point. Something will take its place. Facebook is only the latest generation of online social networks. More interesting and challenging than killing Facebook is fixing Facebook—and, by extension, fixing the entire attention economy across invasive online platforms.
The “attention economy,” broadly speaking, is the market for our attention span throughout the day. The Internet, television, social media, and radio all fight for our attention. Legacy media have long packaged and sold our attention to advertisers. But the advent of social media and search has ushered in much richer two-way transfers of information that have put our privacy at risk and also created secondary and tertiary markets for minute details of our online lives.
Here’s one idea on how to fix Facebook: Put a price on every single Facebook user and allow them to pay it to opt out of any tracking or any other activities or algorithmic interventions. That’s been suggested before, and Facebook COO Sheryl Sandberg even recently addressed it. But here’s the twist: The prices must vary by individual user and will reflect the user’s actual value to Facebook. Users in poor countries will be worth a lot less than users in rich countries.
And rich users will likely be worth more to Facebook than poor users, so they would have to pay more to use the service. (We know that Facebook cares about income distribution because it has offered advertisers and marketers the capability to target by inferred income.) This would create a sliding scale that would somewhat mitigate the cost to users of lesser means. Your personal attention economy number would be unique to you. We know Facebook is already doing this internally, to some degree; the company calls it “average revenue per user.” In the U.S. and Canada, for example, a Facebook user generates about $26 in revenue per quarter. Globally, users are worth $6.18 per quarter.
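To make the arithmetic concrete, here is a minimal sketch of how such a personal opt-out price could be derived. The quarterly ARPU figures come from the numbers above; the income-based multiplier is purely an illustrative assumption of the "sliding scale" idea, not anything Facebook has proposed:

```python
# Illustrative sketch: pricing an ad-free, tracking-free opt-out from
# quarterly average revenue per user (ARPU). The ARPU figures are the
# ones cited in the article; the income multiplier is a hypothetical
# sliding scale, not an actual Facebook mechanism.

QUARTERLY_ARPU = {
    "us_canada": 26.00,  # ~$26 in revenue per user per quarter
    "global": 6.18,      # $6.18 per user per quarter
}

def annual_opt_out_price(region: str, income_multiplier: float = 1.0) -> float:
    """Annualize quarterly ARPU, scaled by an assumed income factor.

    income_multiplier > 1 for users inferred to be higher-income,
    < 1 for users of lesser means (the "sliding scale" in the text).
    """
    return round(QUARTERLY_ARPU[region] * 4 * income_multiplier, 2)

print(annual_opt_out_price("us_canada"))    # a U.S./Canada user: $104 per year
print(annual_opt_out_price("global", 0.5))  # a discounted lower-income price
```

The point of the sketch is only that the price is individual and derivable from numbers Facebook already reports, so publishing it would cost the company little.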
This would make the entire attention economy an explicit exchange of our attention for services that we are using with an easy opt-out.
Those who won’t pay but want to keep using the service will, in effect, agree to give up their information and cede their rights to privacy. It should even be worded that way, in large type, in the terms of service agreement. This would happen explicitly and in one fell swoop, rather than drip by drip. And we could get rid of the Kabuki theater in which Facebook professes to care about user privacy while its entire business model to date has been premised on selling as much information about users as it can get away with.
This transparency will start to build trust and create a real marketplace for our attention. It would also work in tandem with other legislative efforts to enforce privacy, like the proposal from Sens. Mark Warner (D-Va.) and Elizabeth Warren (D-Mass.) to force companies to pay $50 or $100 for each person whose data is stolen. Likewise, it would complement proposals for better regulations and definitions of privacy in the U.S., as laid out by journalist and privacy expert Julia Angwin.
The important point, however, is changing the entire attention economy by making it an explicit market that we can see and understand instead of a murky exchange where the users are the product. In this manner, what I propose could actually work for companies like Equifax (EFX), where services that surreptitiously collect information about us would have to receive our informed consent and disclose our value whenever they sell our information. And it would build upon other ongoing Facebook efforts to make privacy controls easier to use.
Putting a transparent and personal price tag on Facebook privacy would also have the beautiful effect of making it clear whether people actually want Facebook (and other attention economy services). If no one wants to pay but they still want to keep using Facebook, then their intent becomes very obvious: People are voting with their wallets and saying that they don’t value their privacy very much. If people don’t want to pay and then stop using Facebook, that’s also obvious: They have said that the service is not worth the price of their privacy. If people want to pay and keep using Facebook sans tracking and ads, then that sends another message: that they value Facebook’s service but also value their privacy. In one stroke, we can create far better alignment between user needs and company goals. To make this transaction more accountable, once per year, Facebook (or any other service) would need to send us a report of how our information was sold, who bought it, and for how much, and request an annual opt-in with the same caveats and large-print type.
We can take it one step further, as well, and mandate that Facebook put a button on every ad or sponsored post that tells us who bought the post, where they are located, and how much they paid to put that post in front of our eyes. This might shock some people, mainly by revealing how little advertisers pay to put things in front of us.
What applies to Facebook could easily apply to Google (GOOG), Twitter (TWTR), Snapchat (SNAP), Instagram, and other free social and search services. In every case, users should have the ability to know what their attention is worth.
To be clear, there are some gray areas that won’t fit perfectly into what I’m proposing.
In the case of Cambridge Analytica, for example, users were essentially tricked into willingly submitting their data to a third-party application. In cases like this, government regulation should kick in and fine Facebook, just as it would for other privacy violations that involve a failure to enforce privacy rules and terms of service. Separately, Facebook may want to ask paying users to turn on facial recognition features in order to enable auto-tagging, and some users may like that idea.
As a general social principle, mandating transparency in transactions is a healthy policy, and it’s the right thing to do. Opacity hides unfairness and unethical business models. In sector after sector, this has been shown to be true: Unethical businesses seek to hide the value of a customer, a good, or a service from the end user. The situation is slightly flipped with social and search, where users are getting something for free, but the ethics remain the same.
The bottom line is this: It is fundamentally impossible to build a business that puts users’ privacy first when the business model depends on selling private data about those users without making this practice obvious or explicit. Make everything explicit, obvious, and open—and keep it simple. This will pave the way for a healthier generation of attention economy businesses that will have far better alignment between their business models and their users’ interests.
Alex Salkever is an author, public speaker, and former vice president of marketing at Mozilla. He is the author of The Driver in the Driverless Car: How Our Technology Choices Will Create the Future.