Our first event at betaworks Studios.
On Monday, April 23rd, a week before our April 30th launch, we opened the doors to betaworks Studios for a live beta test of the Studio. As you can see from the pictures below, the space looks great. But more important are the ideas that will be discussed in the space and the people discussing them. So we brought industry leaders and thinkers together for a few hours to discuss Facebook. Specifically, given the recent revelations around Cambridge Analytica: can Facebook be fixed, and does it even need (or want) to be fixed? We also touched on larger questions around government regulation and innovation.
Joining our betaworks Studios CEO, John Borthwick, on stage were moderator Andrew Keen, author of Fixing the Future; Jessi Hempel, Senior Writer at Wired; and Roger McNamee, founding partner at Elevation Partners. Roger, despite being an early Facebook investor and the person who introduced Sheryl Sandberg to Mark Zuckerberg, was one of the first investors to raise concerns over Facebook's product and ethics. We covered a lot of ground and feel honored to have such knowledgeable and passionate panelists share their views with our new members.
Below are some thoughts that stuck out from the evening.
Andrew Keen: Why are we talking about Facebook anyway?
Jessi Hempel: After Cambridge Analytica, when a whistleblower exposed a data breach back in March, the conversation around Facebook has changed. Instead of the original narrative about the iconic, successful millennial — goes to college, starts a company in a dorm room, and it becomes wildly successful — we’re talking about what does it mean to own my data? Who gets to own my data? Who can I trust with my data? What does it mean to protect ourselves? Should we shut our Facebook accounts down?
“The sad thing is that not one thing that happened here wasn’t the result of a conscious decision.”
AK: Is anyone innocent?
Roger McNamee: When this began, I thought it was a set of innocent mistakes or the hubris of youthful entrepreneurs. The sad thing is that not one thing that happened here wasn’t the result of a conscious decision.
AK: How much of my data does Facebook have?
John Borthwick: Jessi referred to it as a data breach, but this was a data policy. This was an explicit policy for apps to access all of your data. Facebook has never sold data; it’s also true that they’ve given data to everyone who asked between 2010 and 2014.
To make it very personal, I downloaded my data. 260 apps had access to my data. My 2,500 friends and their phone numbers were shared with these apps; in addition, they'd shared my data with 680 advertisers, including the Obama campaign and Taylor Swift. One thing that struck me on the developer side was that 40% of these apps were actually defunct companies. If I were a nefarious actor, I could buy those apps and access that data.
AK: Many companies have access to our data and are more sophisticated at using it — so what? Why is this different?
JB: First and foremost, Facebook has proven to be a horrible custodian of that data.
RM: They’re an anomaly in that they were the first product to take human emotion and profit from it on a daily basis. They turned an emotional graph into a business. In that sense, they have far more ability to do damage than anyone else.
“Historically, antitrust is the most pro-growth thing you can do.”
Surely regulation isn’t a friend of innovation, right?
RM: I do not think that antitrust is the same as regulation: regulation alters their behaviors; antitrust alters their structure.
Historically, antitrust is the most pro-growth thing you can do. Monopolies are horrible for innovation, economic growth, and employment. If you’re pro-growth, the best thing you can do is to follow something like the AT&T consent decree of 1956, an antitrust measure in which AT&T agreed to box itself into regulated telephony and therefore not enter the computer industry. It turned over all of its patents for free license, and that included the transistor — and that literally led to Silicon Valley. Simply, if you don’t have competition, you don’t have entrepreneurship.
Is more tech the solution?
JH: I’m not sure that new technology that’s created won’t create an entirely new problem. Regulation exists, I think, between consumers who want a product but don’t really understand what it’s creating and businesses that can’t trust their own best instincts.
JB: We need to get beyond techno-utopianism. We need to get beyond this idea that this particular technology, this shiny new toy like blockchain, will solve everything. These are not neutral tools; as entrepreneurs, we’re shaping them, and there needs to be agency and accountability for that. One of the things that’s changed is the belief that every shiny toy an entrepreneur puts forth is inherently good; that belief needs to be tested now.
RM: I’m going to paraphrase Einstein who said, “It’s really hard to fix a problem using the same techniques that got you into it in the first place.”
“I think everyone should be able to own and take their data with them.”
AK: What’s one piece of regulation that you would implement and legislate?
RM: The fiduciary rule so that, like a doctor or like a lawyer, there is a legal, moral, ethical, financial responsibility for anyone holding data to protect the user.
JH: Data portability. I think everyone should be able to own and take their data with them.
JB: I’ll do antitrust then because I think that covers it all.
This conversation has been edited and condensed for clarity.