“I’m really sorry that this happened,” Facebook CEO Mark Zuckerberg said in his first interview on the massive Cambridge Analytica data scandal that has roiled the social network this week.
Zuckerberg spoke to CNN on Wednesday night, hours after he and COO Sheryl Sandberg issued their first public statements on the matter. Cambridge Analytica inappropriately used 50 million Facebook user profiles to target ads in support of its clients, including Donald Trump’s successful 2016 election campaign.
In the interview, Zuckerberg said that he regrets taking Cambridge Analytica at its word when it signed a legal document in 2015 certifying that it had deleted the information from those profiles. It had not.
“I don’t know about you, but I’m used to when people legally certify that they are going to do something, that they do it. But I think this was clearly a mistake in retrospect,” Zuckerberg said.
He also said that he is willing to testify before Congress on the matter, though he would prefer to send whoever has “the most knowledge” on a given line of questioning. He added that he is open to the idea of regulating tech, and is in favor of legislation requiring transparency in online advertising.
Cambridge Analytica is being scrutinized for the methods it used during the 2016 presidential election, after executives with the firm boasted about their ability to covertly target voters, entrap politicians, and launch propaganda campaigns.
The reach of those operations was multiplied by connected platforms like Facebook. Russian operatives capitalized on this to a significant degree, sowing political discord among likely voters in a wide-ranging effort to meddle in the US election. Zuckerberg has acknowledged this in fits and starts, after initially balking at the idea last year.
He expressed some lingering disbelief at the idea on Wednesday night:
“If you’d told me in 2004 when I was getting started with Facebook that a big part of my responsibility today would be to help protect the integrity of elections against interference by other governments, I wouldn’t have really believed that that was gonna be something that I would have to work on 14 years later,” the Facebook CEO said.
“But we’re here now, and we’re gonna make sure we’re going to do a good job at it,” he added.
And while he conceded that “I’m sure someone’s trying” to influence the 2018 midterm elections, he said the company is better prepared to meet the challenge.
READ THE PRESS RELEASE ON THE SCANDAL BELOW
Cracking Down on Platform Abuse
Protecting people’s information is the most important thing we do at Facebook. What happened with Cambridge Analytica was a breach of Facebook’s trust. More importantly, it was a breach of the trust people place in Facebook to protect their data when they share it. As Mark Zuckerberg explained in his post, we are announcing some important steps for the future of our platform. These steps involve taking action on potential past abuse and putting stronger protections in place to prevent future abuse.
People use Facebook to connect with friends and others using all kinds of apps. Facebook’s platform helped make apps social — so your calendar could show your friends’ birthdays, for instance. To do this, we allowed people to log into apps and share who their friends were and some information about them.
As people used the Facebook platform in new ways, we strengthened the rules. We required that developers get people’s permission before they access the data needed to run their apps — for instance, a photo sharing app has to get specific permission from you to access your photos. Over the years we’ve introduced more guardrails, including in 2014, when we began reviewing apps that request certain data before they could launch and introduced more granular controls for people to decide what information to share with apps. These actions would prevent any app like Aleksandr Kogan’s from being able to access so much data today.
Even with these changes, we’ve seen abuse of our platform and the misuse of people’s data, and we know we need to do more. We have a responsibility to everyone who uses Facebook to make sure their privacy is protected. That’s why we’re making changes to prevent abuse. We’re going to set a higher standard for how developers build on Facebook, what people should expect from them, and, most importantly, from us. We will:
Review our platform. We will investigate all apps that had access to large amounts of information before we changed our platform in 2014 to reduce data access, and we will conduct a full audit of any app with suspicious activity. If we find developers that misused personally identifiable information, we will ban them from our platform.
Tell people about data misuse. We will tell people affected by apps that have misused their data. This includes building a way for people to know if their data might have been accessed via “thisisyourdigitallife.” Moving forward, if we remove an app for misusing data, we will tell everyone who used it.
Turn off access for unused apps. If someone hasn’t used an app within the last three months, we will turn off the app’s access to their information.
Restrict Facebook Login data. We are changing Login so that in the next version, the data an app can request without app review will include only name, profile photo and email address. Requesting any other data will require our approval.
Encourage people to manage the apps they use. We already show people which apps their accounts are connected to and let them control what data those apps are permitted to use. Going forward, we’re going to make these choices more prominent and easier to manage.
Reward people who find vulnerabilities. In the coming weeks we will expand Facebook’s bug bounty program so that people can also report to us if they find misuses of data by app developers.
There’s more work to do, and we’ll be sharing details in the coming weeks about additional steps we’re taking to put people more in control of their data. Some of these updates were already in the works, and some are related to new data protection laws coming into effect in the EU. This week’s events have accelerated our efforts, and these changes will be the first of many we plan to roll out to protect people’s information and make our platform safer.