“The future is private” is a message most of us wouldn’t expect to hear from the world’s data-hungry, security-embattled social network behemoth.
Yet privacy and its future in the context of Facebook’s near-three-billion-person community was overwhelmingly central to Mark Zuckerberg’s keynote address at the annual F8 developer conference. Held in late April, F8 was the platform where the enigmatic young founder delivered his big-picture intentions for the 15-year-old Facebook – default end-to-end encryption, secure payments and private interactions being just some of the language Zuckerberg used to clarify the new order in a community of 2.7 billion users, most of whom log in every month.
“For the last 15 years, we’ve built Facebook and Instagram into digital equivalents of the town square, where you can interact with lots of people at once. Now we’re focused on building the digital equivalent of the living room, where you can interact in all the ways you’d want privately — from messaging and stories to secure payments and more.” – Mark Zuckerberg, F8 2019
Critics argue that Facebook’s new stance on privacy is merely a response to the data leaks and subsequent scandals that have rocked the company for over a year now. Notwithstanding the risk of scaring away investors, the privacy and security reputation issues Facebook has had to deal with as a ruthless collector of user data have undoubtedly spurred the new vision.
Some industry commentators, this writer included, find Zuckerberg’s “the future is private” statement to hold more news value than the new features – some useful social tools, others superficial novelties – also unveiled at F8.
Facebook’s privacy focus: the probable dark side
On the opposite end of the spectrum, the call to address people’s growing dependence on technology and superfluous social networks sounds louder. One Silicon Valley insider recently pointed out that Facebook’s privacy-focused social network may not be the answer to much-needed reform in the attention economy, as total encryption, private interactions and interoperability could facilitate all kinds of dangerous online behaviour.
Tristan Harris, a former Google design ethicist and co-founder of the Center for Humane Technology, advocates for a humane agenda for tech, highlighting the worrying loss of civility social networks have given rise to, an issue he says “is bigger than screen time”. In a recent interview, he emphasised the extraction of human attention, or “the race to the bottom of the brainstem”, in which human faculties – attention spans and mental health – are effectively downgraded as machines are upgraded.
Against the background of Facebook’s pivot to a privacy-focused social platform, Harris expressed concern that end-to-end encryption is likely to absolve the social media giant of any liability when misuse and criminal actions take place behind the iron curtain of an increasingly fragmented and incognito community – in Zuckerberg’s words, “the digital equivalent of the living room”.
“Once content is encrypted (in private groups for example) they [Facebook] don’t have to be responsible for disclosing criminal activity to authorities,” said Harris.
Just as the curtains drew on Facebook’s F8 event, Google was preparing for its own developer gig, Google I/O 2019. Like Facebook, the search giant balanced bumper feature announcements with efforts to address privacy-related concerns. As the granddaddy of wholesale data collection, Google is all too familiar with the fine line between being the number one provider of digital products and the trust issues that come with having a terrifyingly intimate understanding of its users. For example, the deep-learning data that has made Google Assistant the sophisticated machine it is today is now stored locally on Android devices instead of in the cloud. At I/O, Google made sure to point this out; at the very least it should offer some reassurance and ease the scepticism of privacy-conscious users.
At Facebook, the privacy vision will guide product development and become part of company culture, which Zuckerberg says will include “consulting with experts on the major trade-offs and social issues to find the best path forward, taking a more active role in making sure developers use our tools in good ways, and building out the technical infrastructure to support this vision.”
The privacy vs. illicit use conundrum
While being less invasive is exactly what governments, mainstream media and the public around the world have been calling for in the wake of broken trust, Facebook will need to work hard to build a non-exploitative experience that supports its privacy mission while actively protecting users at this pivotal point. To safeguard the connected population, Facebook will need to design and build products compassionately, in a way that enables thoughtful and substantive connections while protecting our most vulnerable human instincts.
Fortunately, strong voices such as the Center for Humane Technology are speaking out against the unchecked power of big tech firms: “We envision a world where Humane Technology is the default for all technology products and services. A combination of new design processes, new goals and metrics, new organisational structures, and new business models would drastically reduce harmful externalities, actively supporting our individual and collective well-being.”
If Facebook and Google can deliver on their renewed views of privacy and their products’ purpose in society, social media and the Internet could be fundamentally different experiences from what they are today. I for one hope it will be a future that is private and made up of digital products and services which favour humanity’s best interests.