It's kinda ironic that a post about "All we are doing is connecting people and information" gets deleted because it gets connected to a lot of people ("leaked").
You can't simultaneously hold the opinion that there's no harm in sharing information, while also holding the opinion that there is such a thing as a "leak." It just strikes me as so ironic for a company that champions "privacy is dead, live with it," to have to delete its own valid internal debates because of the consequences of lack of privacy (i.e. leak = lack of privacy).
You may find it additionally ironic that about a month before the Boz memo, it was reported[1] that Zuck bought 4 houses located around his own for $30 million.
Privacy is looking pretty alive in that neighborhood.
If it weren't for the unearned and unassailable value of the network effect Facebook benefits from, this disgusting behavior (from Zuck and his minions) would be enough to drive everyone to a new platform.
"The more secretive or unjust an organization is, the more leaks induce fear and paranoia in its leadership and planning coterie."
"This must result in minimization of efficient internal communications mechanisms (an increase in cognitive "secrecy tax") and consequent system-wide cognitive decline resulting in decreased ability to hold onto power as the environment demands adaption."
"Hence in a world where leaking is easy, secretive or unjust systems are hit nonlinearly relative to open, just systems. Since unjust systems, by their nature induce opponents, and in many places barely have the upper hand, mass leaking leaves them exquisitely vulnerable to those who seek to replace them with more open forms of governance."
Yeah I feel like secretive and unjust are two completely different things, and Julian Assange conflating the two doesn't reflect well on his already questionable character.
Systems that are just do not generally require secrecy to the degree that unjust systems do, precisely due to the fact that when publicized, people will generally agree with just processes.
That's why I said "to the same degree", i.e. privacy is important, but you don't have to have the massive opacity required for large-scale unfairness to persist.
> You can't simultaneously hold the opinion that there's no harm in sharing information, while also holding the opinion that there is such a thing as a "leak."
Likewise, you can't simultaneously hold the opinion that users should have control over where their content is seen, and that it's OK to publish and comment on an internal post. In a less spiteful world, some of the employees' reactions might have been taken as evidence that they do understand and care about issues of privacy or containment. Maybe that would lead to more collaboration on solutions, which is necessary because there are actually some tricky tradeoffs here. But that doesn't give the same dopamine hit as cutting down the tall poppies, right?
I guess I don't understand your point. But I'll elaborate because I'm interested in this topic.
My point is that Facebook's ethos, that a post-privacy era can exist and be okay, is betrayed by how they clamor when it's their own privacy. It makes it feel like it's a one-directional relationship.
Perhaps you're suggesting facebook's ethos is actually that a privacy middle-ground can exist, where people can choose what gets shown where and to whom. I can't disprove that that's their ideal, but in light of their data-sharing I have no reason to believe it.
Aside from ideals, we can point out the consequences of Facebook in practice. Facebook is a written medium that preserves everything (even something from 2 years ago) and thus has constructed a system that forces its users to hyper-curate their entire public persona or suffer social consequences, and from a practical perspective their own VP failed to curate sufficiently. So regardless of ideals, if the system punishes discussion then I see that as a problem, as well as an irony when it happens to their own VP.
> Perhaps you're suggesting facebook's ethos is actually that a privacy middle-ground can exist, where people can choose what gets shown where and to whom.
They can, to a larger degree than most critics seem to realize. I use those controls all the time. I might agree that they're not as prominent or easy to use as they should be, but they exist because people at Facebook cared enough to implement them (which isn't easy or cheap at that scale BTW).
> I can't disprove that that's their ideal, but in light of their data-sharing I have no reason to believe it.
Oh, you mean the data sharing that was curtailed sharply in 2014, and again in increments ever since? Is that continuing effort and foregone revenue "no reason" to believe such a sentiment exists?
> if the system punishes discussion then I see that as a problem
Yes, punishing discussion is a problem. Wait, what is this entire thread, and a thousand like it all spawned by the publication of these posts, about?
>> Oh, you mean the data sharing that was curtailed sharply in 2014, and again in increments ever since?
Don't pretend they're taking care of it on their own. They're reading text messages in 2018.
>> Yes, punishing discussion is a problem. Wait, what is this entire thread, and a thousand like it all spawned by the publication of these posts, about?
Wait... Are you arguing that it's not bad that facebook stifles controversial opinions on its platform because its behavior creates controversial discussions on other websites...?
> Likewise, you can't simultaneously hold the opinion that users should have control over where their content is seen, and that it's OK to publish and comment on an internal post.
There's a power asymmetry here that makes the individual user vulnerable, and that power asymmetry should be countered by demanding transparency.
There's an obvious parallel here between individual citizens and government apparatus.
Those that control infrastructure and institutions shouldn't be enabled to abuse that power. And if they do they shouldn't be surprised when the affected protest!
>> Likewise, you can't simultaneously hold the opinion that users should have control over where their content is seen, and that it's OK to publish and comment on an internal post.
This argument is illogical, because Facebook forces everyone to sign its ToS to use its services, while nobody forces a Facebook employee to leak internal stuff. Said another way, whether or not I wish to have control over my FB data, FB coerces me to agree that it can do whatever it wants with my data. It's not exactly opt-in, is it? It's far worse, of course, if you consider shadow profiles, because it is coercing even people who never explicitly signed up to the ToS. Unless the leak happened via some kind of coercion (which doesn't seem to be the case), your comment is incorrect.
>> In a less spiteful world, some of the employees' reactions might have been taken as evidence that they do understand and care about issues of privacy or containment.
What? You mean you care about something, but you just won't do something about it, nor openly tell anyone why you wouldn't do something about it, or even talk about it before the issue blows up? Yep, totally convincing.
>> Maybe that would lead to more collaboration on solutions,
Why do people need to "collaborate" on solutions? What do they get from it? Is Facebook going to pay people a share of the profits? If Facebook is a corporate entity which serves its self-interest against people's self-interest (which they have clearly been doing for a long time), what kind of idiot would suggest the people whose self-interest has been affected should now "come to the table" so "we can all work something out"?
>> which is necessary because there are actually some tricky tradeoffs here.
The only tricky tradeoff here is: should Mark Zuckerberg be the only one who should go to jail, or should the entire company be rounded up? It is quite tricky, I do agree.
>> But that doesn't give the same dopamine hit as cutting down the tall poppies, right?
I don't know about tall poppies, but "culling" the "weeds" is the only way to have a healthy garden.
You're free not to use it. If that opt-in isn't enough, exactly how many levels do you want? If you do choose to use a free service, whether it's Facebook or a public library, you have to consider how it's paid for. Actively using something and also actively undermining its means of support ... well, I'll just leave that thought there.
> You mean you care about something, but you just won't do something about it
You seem to have some pretty unrealistic expectations of what individual employees can do at a 30K-person company, or about anyone taking the right action without deliberating first.
> Why do people need to "collaborate" on solutions? What do they get from it?
Ummm ... the solutions, which are not only applicable to Facebook? This is a general problem faced by many companies. The solutions could also be useful to the people who blather about creating a distributed alternative to Facebook. I've been a member of the decentralization and distributed-system community for far longer than Facebook or Y Combinator have existed. I also know something about the scale and connectedness of the data at Facebook. We're multiple basic innovations away from being able to create such an alternative. Wouldn't it be nice if people who actually understand various parts of this can talk and work together? That doesn't become more likely when every discussion is filled with people who only read others' comments enough to find where to insert their own half-baked opinions or insults.
>> If that opt-in isn't enough, exactly how many levels do you want?
Since you can't seem to count to 2, how about:
1. You let us share your data with others in return for free service
2. You don't let us share your data in return for paid service
>> If you do choose to use a free service, whether it's Facebook or a public library
Well, a public library is tax funded and people outside the library employees have a big say in its inner workings. So you can't get your comparisons correct either.
>> Actively using something and also actively undermining its means of support ... well, I'll just leave that thought there.
Perhaps you should complete the thought, because I don't actively use the "something" in question.
>> You seem to have some pretty unrealistic expectations of what individual employees can do at a 30K-person company, or about anyone taking the right action without deliberating first.
Really, as opposed to your very realistic expectations that everyone should just trust FB employees would have "done the right thing" had they not been caught red-handed? Oh right, because FB knows better what is best for everyone else.
>>Wouldn't it be nice if people who actually understand various parts of this can talk and work together?
This is truly bizarre. So if FB rolls over and dies tomorrow, does it mean innovation will come to a complete halt? Let us say you think, "oh, but it might take much longer". Does that automatically adversely affect people more than the damages that can be caused to society via rampant data collection? How can you be so sure? Oh wait, because you must be smarter than everyone else, as you got through the interview.
And finally, it is interesting all the things that you selectively left unsaid (exactly like other FB employees have been doing all the while).
- you don't have the courage (what an ironic handle) to discuss shadow profiles
- you never actually addressed the fact that no one from outside coerced the leak, which made your first comment more rhetorical than substantial
- you cleverly twisted the "collaboration" to be amongst FB employees, when the line that follows clearly shows you actually meant collaboration between FB employees and its users (a dopamine hit for whom, then? so you are now assuming others cannot read either?)
Bystander here. Why the ban? It’s snarky in places for sure but I’d say it’s a pretty solid set of points and counter points. It definitely “added something” to my experience reading this thread.
I suggest that you also identify the primary account behind it and give them a reminder too, or else they'll just keep doing it over and over again until their targets run out of patience.
> 2. You don't let us share your data in return for paid service
Personally, I think that might be a good option, but you can't claim to have made it explicit before so your "count to 2" insult is misplaced. I know that the only thing you've ever done since your account was created is bash Facebook (how nice that anyone can check that for themselves BTW), but even in that light such childishness is counterproductive.
> if FB rolls over and dies tomorrow, does it mean innovation will come to a complete halt
Total strawman. Nobody said or implied that. There's plenty of knowledge and innovation everywhere, but the amount that can come from Facebook only has to be non-zero to support my point. Several hundred developers who have collectively worked on almost every distributed system you've ever heard of might have an idea or two worth discussing. They might even have a perspective on scaling issues that's highly relevant to the problem at hand but not widely known outside of Facebook and maybe three other companies. Why do you try so hard to throw cold water on any such conversations?
You were teetering on incivility earlier in the thread and here you fell straight into it. Please don't! Instead, please read https://news.ycombinator.com/newsguidelines.html and follow the rules regardless of how badly anyone else is behaving.