Barbara Howard: Facebook CEO Mark Zuckerberg, in the hot seat today and tomorrow, testifying before Congress. Zuckerberg, being asked about Facebook's role in influencing elections, as Facebook users were targeted with "fake news." It's now known that Facebook user data was surreptitiously collected by the London-based firm Cambridge Analytica, which worked with the Trump campaign in the 2016 election. An apologetic Zuckerberg has pledged to fix Facebook. He's enlisted the help of social scientists like Professor Gary King, who runs Harvard's Institute for Quantitative Social Science. He's with us on the line. Thanks for joining us, Professor King.
King: Thanks very much. Great to be here.
Howard: So this initiative to fund and produce research examining Facebook's role in politics, it's based on your work. What kind of information is the research going to be looking at?
King: Well, Facebook is going to let us look at the effect of social media on democracy and elections. And there's quite a lot of questions in there. We hope the specific model we came up with will make it possible for a private company to share quite a lot of information with academics pursuing social good, instead of, or in addition to, the company pursuing its own missions. The social scientists are trying to tackle some of the greatest challenges that affect society, and some of the largest amounts of data in the world about human societies are actually inside private companies. So being able to develop this model that makes it possible for industry and academia to work together will wind up benefiting everybody.
Howard: What are some of the major ways Facebook's been part of election outcomes here, with the Trump election ... and Brexit also comes to mind?
King: Yes, well, we of course don't know what the effect of social media was on these elections, but we certainly know that politicians and many others advertise on social media and try to influence other people on social media.
Howard: But we're talking about something completely different, not just straight up advertising, which most people recognize. We're talking about other things, aren't we?
King: Oh sure. We're talking about misinformation and people claiming they are one person and actually being another person and making up a completely fake news outlet and advertising that, absolutely. We're trying to figure out and identify all these consequences.
Howard: Well, Facebook's now dealing with the fallout from Cambridge Analytica siphoning data from its users. But under this research plan, all kinds of Facebook data will be made accessible to academics. Is that something we should be worried about?
King: No, absolutely not, because there are very clear and very accurate methods that make it impossible for anybody to leak information. This is what should have been done originally with Cambridge Analytica and others. Academics are very carefully vetted. They'll have access to locked-down computer systems where every single key that they type is completely audited. So their privacy is not going to be protected in what they're doing (everybody is going to know), but the privacy of individuals who used Facebook is certainly going to be protected. The great thing is that all of this information could really be used for the public good, to find out the effect of social media on elections, and perhaps on a vast array of other issues as well.
Howard: Well, Facebook's not exactly trusted, I think it's fair to say right now. So who's to say that the company won't be selective or manipulative when it comes to the fruits of this research that you're going to be doing to make itself look better?
King: So, this is the great thing about the particular model that we set up. It's not possible for Facebook to stop us from publishing, because that's the agreement. The agreement is that once we put out a request for proposals, and we award a grant funded not by Facebook, by the way, but funded by a coalition of politically and ideologically diverse non-profit foundations, then when outside independent experts have the data, they do their analysis and they decide to publish, Facebook has no review possibility. This is an extraordinarily unusual situation. Almost no companies allow this kind of thing, or at least it's very, very rare, and certainly Facebook hasn't done it before. Moreover, the foundations themselves are providing the money, but they're not actually deciding who gets it. The academics are deciding who gets the money, nobody else — not Facebook, not the foundations, nobody.
Howard: Well, the bigger question is this, though: Can Facebook ever be reined in, with so many users, so much going on across the whole platform? Is there a chance that this research will be for naught?
King: Well, we'll certainly find things out. We'll find things out about what Facebook has done, and mostly what Facebook might do, and what kinds of effects not only Facebook, but social media and the Internet in general, have on the public and on elections and on democracy.
I think we need to know that. I think it's really important that we know that. If we know that, then we might be able to improve things.
Howard: Ok, thanks for joining us Professor King.
King: Thanks for having me. I appreciate it.
Howard: That's Professor Gary King who runs Harvard University's Institute for Quantitative Social Science, and his work is the basis for a new initiative to study how Facebook is used, and perhaps misused, when it comes to elections. The company's CEO, Mark Zuckerberg, testifying before Congress today and tomorrow.