Interview with Former Diaspora CEO Yosem Companys; Featured in “More Awesome than Money” (Part 3)

I just finished reading the fascinating new book More Awesome Than Money: Four Boys and Their Heroic Quest to Save Your Privacy from Facebook by Pulitzer Prize winner and New York Times reporter Jim Dwyer. The Wall Street Journal describes the book as the story of “four idealists frustrated with Facebook’s control over our personal data…to create an alternative,” and why they didn’t ultimately succeed. Besides being a fascinating story, with drama and even tragedy (specifically, the suicide of brilliant, charismatic co-founder Ilya Zhitomirskiy), the book covers important issues facing all of us in the age of social media, the “cloud,” etc.: privacy, the digital “Panopticon,” the profit motive vs. creating something socially beneficial, how promising technologies do or don’t end up getting funding to move forward, implications for society, even human identity itself. I make absolutely no pretensions to being an expert on any of this, just someone interested in the subject. So, I asked my friend Yosem Companys — who teaches high-technology entrepreneurship at Stanford University, runs social media for Stanford’s Program on Liberation Technology, and previously worked as consigliere and CEO of Diaspora (and who plays a crucial role in “More Awesome than Money”) — whether he would be willing to answer a few questions. He graciously agreed. Here’s the interview, edited for conciseness and clarity. Note: I’ve decided, due to the interview’s length (16 questions and answers), to break it up into four parts. The first four questions and answers are available here, the second four questions and answers here. Now, here are #9-#12.

Question #9: There are lots of issues here. It seems that what these Diaspora founders undertook was daunting – to put it mildly.

Yosem Companys: Yeah, I mean, people forget that the guys were just kids, with Max, the oldest, about 22 at the time they started Diaspora. And people also forget that the guys set out to create a summer project, then ended up with a business venture they had not initially asked for. Basically, the New York Times covered their efforts, and there was a huge pent-up need for the service, so they accidentally raised a ton of money on Kickstarter. But it’s incredibly complicated to put together a successful startup in this space, because Facebook is a large company with deep pockets, and, to succeed, one must successfully address a wide range of organizational and technical issues.

But I truly believe that, to build a new site that fixes the problems with privacy, you need to build a decentralized solution that uses a server-to-server, open-source model and employs a privacy-friendly business model. Even better, figure out how to make it encrypted without compromising speed. And, then, to grow it, you need to overcome the network effect either by building a killer app (an extremely difficult task, as you need a lot of luck to strike gold) or by building a HootSuite-like capability that allows people to remain in touch with their family and friends on other sites from your software. But the revolutionary part of all this is the decentralization, because that is what creates the technical means for users to control their data. Anything else that promises to be private, but that uses the old centralized approach, will simply recreate the problem that Facebook has, because the only difference between that and Facebook is a vague promise of privacy.
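To make the decentralization point concrete, here is a minimal, hypothetical sketch in Python of the server-to-server idea: each user’s data lives on a “pod” they choose, and pods deliver posts directly to contacts’ pods, with no central database in between. This is not Diaspora’s actual code (which is Ruby) or its real federation protocol; the Pod class, method names, and domains are illustrative assumptions only.

```python
# Illustrative sketch of a federated, server-to-server model.
# Not Diaspora's actual protocol; names and structure are hypothetical.
from dataclasses import dataclass, field


@dataclass
class Pod:
    """A self-hosted server that stores only its own users' data."""
    domain: str
    inboxes: dict = field(default_factory=dict)   # user -> list of received posts
    contacts: dict = field(default_factory=dict)  # user -> [(remote_pod, remote_user), ...]

    def register(self, user: str) -> None:
        self.inboxes[user] = []
        self.contacts[user] = []

    def share_with(self, user: str, remote_pod: "Pod", remote_user: str) -> None:
        # The relationship lives on the user's own pod, not on a central server.
        self.contacts[user].append((remote_pod, remote_user))

    def publish(self, author: str, text: str) -> None:
        # Federation: the author's pod pushes the post to each contact's pod.
        for remote_pod, remote_user in self.contacts[author]:
            remote_pod.receive(remote_user, f"{author}@{self.domain}: {text}")

    def receive(self, user: str, post: str) -> None:
        self.inboxes[user].append(post)


# Two independently run pods interoperate with no central database in the middle.
pod_a, pod_b = Pod("pod-a.example"), Pod("pod-b.example")
pod_a.register("alice")
pod_b.register("bob")
pod_a.share_with("alice", pod_b, "bob")              # alice shares her posts with bob
pod_a.publish("alice", "Hello from my own server!")
print(pod_b.inboxes["bob"])                          # the post arrived on bob's pod
```

The point of the sketch is simply that no single operator holds everyone’s data: shut down or subpoena one pod and the rest of the network keeps running, which is what makes the model both privacy-preserving and hard to police.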

Question #10: What does the Diaspora experience tell us about privacy in the digital, social age we live in?

Yosem Companys: Unfortunately, it tells us that, until Snowden, there was not a whole lot of interest among investors in fixing the privacy problem. So we were — and, for all intents and purposes, still are — living in a panopticon. Until we start building sites as I suggested above, or find new ways to do so, such as adopting Doc Searls’ approach to vendor-relationship management, any new site that is built will simply reinforce the panopticon, because someone could spy on you, whether that someone is the site’s programmers, the government, malicious hackers, or another party. You will never be fully protected, of course, as control and encryption do not prevent infiltration and human error, such as your inadvertently revealing your password to someone else. But, if we started building sites differently, we would be much better off.

One interesting thing we have not talked about is that geographic location matters, as the government can throw you in jail if you refuse to hand over your users’ data, whereas some locations are more privacy-friendly (for example, Switzerland or Iceland — see here for instance). Germany has also legally enshrined a degree of protection for privacy. Culturally speaking, Europeans are in general more sensitive about privacy than Americans.

Question #11: After reading the book, I came across articles, like “ISIS Finally Finds a Social Network That Can’t Suspend Their Accounts,” about terrorists using Diaspora. I don’t know about you, but I find that disturbing. The fact is, technology – whether centralized, decentralized, top-down, bottom-up, whatever – can be used by humans for good or ill. But would you say that the very nature of Diaspora opens it up more to truly bad actors – ISIS, criminal enterprises of all sorts – than a network with “gatekeepers” like Facebook?

Yosem Companys: Because it is decentralized, open-source software that anyone can use, there’s no question that Diaspora opens itself up to all sorts of bad actors, including terrorists. But there is nothing Diaspora can do about that, because the jihadists are running Diaspora’s free software on their own servers, and Diaspora has no control over who uses its software or on which servers. That, in and of itself, shows you the disruptive potential of Diaspora’s model.

It may be disturbing, but it’s the way our world works. Technologies are just tools that can be used for good or ill, depending on the values of their designers and users. For example, Ford makes and sells cars to everybody, and a terrorist could buy one and strap a bomb to it to carry out an attack. But that’s not Ford’s fault. It’s up to law-enforcement agencies to track down the bad guys. And there are a number of instruments, such as warrants, that law-enforcement agencies can use to get access to personal data when they suspect users are doing bad things. In society, we need both privacy and security. The idea that they are in opposition to each other is wrong. But finding the ideal way to preserve both is tricky.

Question #12: Do you believe the Diaspora experience tells us anything about human identity in the digital/social age we live in?

Yosem Companys: One of the most interesting things we observed among Diaspora’s users in terms of identity was that they preferred to craft their own. Unfortunately, writing computer code entails creating a set of abstract instructions that spell out all the things users can do on a given site. These design decisions also necessarily restrict what users can do: some features conflict with other things users want to do, the designer could not imagine or was simply not aware of something a user wanted to do, or giving too many options would overwhelm the user. Thus, almost by definition, your identity can only be that which the developers conceive of and code for you.

One simple example will drive the point home: When we were working on Diaspora, every major site we knew of required that you fill out the gender field by picking from a dichotomous menu of either male or female. Thanks to Sarah Mei, we instead left the gender field as a blank, free-form text box, which became a huge hit in the LGBT community and made us look quite progressive, something for which we were both criticized and lauded.
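As a purely illustrative sketch (in Python, not the actual Diaspora code, which was written in Ruby), the contrast between the two design choices might look something like this; the function and field names are hypothetical.

```python
# Illustrative only: how a profile form can encode the designer's values.
# Field names and functions are hypothetical, not Diaspora's actual code.

# The common pattern at the time: a closed menu the designer chose in advance.
GENDER_CHOICES = ("male", "female")

def set_gender_closed(profile: dict, value: str) -> None:
    if value not in GENDER_CHOICES:
        raise ValueError("unsupported gender")  # the user's identity must fit the menu
    profile["gender"] = value

# The Diaspora-style alternative: an optional, free-form text field.
def set_gender_open(profile: dict, value: str = "") -> None:
    profile["gender"] = value.strip()  # any self-description, or left blank

profile = {}
set_gender_open(profile, "genderqueer")  # accepted as-is
set_gender_open(profile)                 # leaving it blank is fine too
print(profile["gender"])                 # prints "" because the last call left it blank
```

The closed version bakes the designer’s assumptions into a validation rule; the open version simply stores whatever the user says about themselves, including nothing at all.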

The point here is that designers rarely spend time thinking about how their own values, either strategically or inadvertently, influence how users are allowed to express their identities online. That is something MySpace did quite well when it accidentally allowed users to select their own wallpaper, a feature that became wildly popular with users but that the developers later explained was the result of a glitch in the code, one they debated extensively whether to keep. Also, there is an extensive literature in a number of fields (e.g., sociology of technology, history of technology, communication science, information science, and human-computer interaction) on how design values influence user behavior, but most developers are rarely exposed to this literature, so they are unaware of how their values and programming practices influence their code.

Instead, they typically start with the mathematical assumption that designing a “beautiful” algorithm reveals some objective truth about the world, forgetting that they themselves are subjectively deciding what that algorithm should look like and what users are and are not allowed to do.
