There is an abiding dream in the tech world that when all the planet’s people and data are connected it will be a better place. That may prove true. But getting there is turning into a nightmare – a world where billions of people are connected but without sufficient legal structures, security protections or moral muscles among companies and users to handle all these connections without abuse.
Lately, it feels as if we’re all connected but no one’s in charge.
Equifax, the credit reporting bureau, became brilliant at vacuuming up all your personal credit data – without your permission – and selling it to companies that wanted to lend you money. But it was so lax in securing that data that it failed to install simple software security fixes, leaving a hole for hackers to get the Social Security numbers and other personal information of some 146 million Americans, or nearly half the country.
But don’t worry, Equifax ousted its CEO, Richard Smith, with “a payday worth as much as $90 million – or roughly 63 cents for every customer whose data was potentially exposed in its recent security breach,” Fortune reported. That will teach him!
Smith and his board should be in jail. I’m with Sen. Elizabeth Warren, who told CNBC, “So long as there is no personal responsibility when these big companies breach consumers’ trust, let their data get stolen, cheat their consumers … then nothing is going to change.”
Facebook, Google and Twitter are different animals in my mind. Twitter has enabled more people than ever to participate in the global conversation; Facebook has enabled more people than ever to connect and build communities; Google has enabled everyone to find things like never before.
Those are all good things. But the three companies are also businesses, and the last election suggests they’ve all connected more people than they can manage and they’ve been naive about how many bad guys were abusing their platforms.
As Mark Warner, the top Democrat on the Senate Intelligence Committee, put it to me, “Up to now these companies have not taken seriously enough the threat that Russia and other foreign agents pose to our system, or invested enough to really reveal what happened in 2016 – or what is still happening now.”
In November last year, Facebook CEO Mark Zuckerberg dismissed as “a pretty crazy idea” evidence that people were using Facebook to generate fake news to tip the U.S. election. Last week, after disclosing hundreds of Russia-linked accounts – where fictional people posing as U.S. activists spread inflammatory messages about immigration and guns and trashed Hillary Clinton and boosted Donald Trump – Zuckerberg admitted, “Calling that crazy was dismissive and I regret it.”
One reason Facebook was slow to respond is that its business model was to absorb all of the readers of the mainstream newspapers and magazines and to absorb all their advertisers – but as few of their editors as possible. An editor is a human being you have to pay to bring editorial judgment to content on your website, to make sure things are accurate and to correct them if they’re not. Social networks preferred to use algorithms instead, but these are easily gamed.
America’s democracy is built on two principles: truth and trust. We trust that our elections are fair and that enables our peaceful rotations of power. And we trust that the news we get from our mainstream outlets is true and that it is corrected if it is not. And we expect our president to defend both. But today many people are getting news from platforms that are easily polluted by Russian or other hackers with fake news. And our president is a liar who refuses to hold Russia to account for anything. It’s a terrible combination.
We can’t fix Trump right now. But have Equifax and these big social networks become so much a part of the wiring of our lives – and the effects of their failures so consequential – that they should be regulated in new ways? I don’t know, but I know it’s time for this discussion. It’s already started.
These companies make billions selling our data, but they’re ambivalent about taking responsibility “for the uses, and abuses, of their platforms,” argued Harvard political philosopher Michael Sandel. “They can’t have it both ways. If they claim they are neutral pipes and wires, like the phone company or the electric company, they should be regulated as public utilities. But if, on the other hand, they want to claim the freedoms associated with news media, they can’t deny responsibility for promulgating fake news.”
In the early 20th century, Sandel added, “the rise of monopolies and concentrated economic power brought forth an era of progressive reform that regulated railroads, banks and utilities in the public interest. Today, we need a similar spirit of reform. These platforms are so dominant that, like electric wires or telephone lines, we can scarcely avoid using them. But when they allow our personal data – or elections – to be hacked, there’s not much we can do about it.”
“A century ago, we found ways to rein in the unaccountable power associated with the Industrial Revolution,” Sandel concluded. “Today, we need to figure out how to rein in the unaccountable power associated with the digital revolution.”