>> Likewise, you can't simultaneously hold the opinion that users should have control over where their content is seen, and that it's OK to publish and comment on an internal post.
This argument is illogical, because Facebook forces everyone to sign its ToS to use its services, while nobody forces a Facebook employee to leak internal stuff. Said another way, whether or not I wish to have control over my FB data, FB coerces me to agree that it can do whatever it wants with my data. It's not exactly opt-in, is it? It's far worse, of course, if you consider shadow profiles, because it even coerces people who never explicitly signed up to the ToS. Unless the leak happened via some kind of coercion (which doesn't seem to be the case), your comment is incorrect.
>> In a less spiteful world, some of the employees' reactions might have been taken as evidence that they do understand and care about issues of privacy or containment.
What? You mean you care about something, but you just won't do something about it, nor openly tell anyone why you wouldn't do something about it, or even talk about it before the issue blows up? Yep, totally convincing.
>> Maybe that would lead to more collaboration on solutions,
Why do people need to "collaborate" on solutions? What do they get from it? Is Facebook going to pay people a share of the profits? If Facebook is a corporate entity which serves its self-interest against people's self-interest (which they have clearly been doing for a long time), what kind of idiot would suggest the people whose self-interest has been affected should now "come to the table" so "we can all work something out"?
>> which is necessary because there are actually some tricky tradeoffs here.
The only tricky tradeoff here is: should Mark Zuckerberg be the only one who should go to jail, or should the entire company be rounded up? It is quite tricky, I do agree.
>> But that doesn't give the same dopamine hit as cutting down the tall poppies, right?
I don't know about tall poppies, but "culling" the "weeds" is the only way to have a healthy garden.
You're free not to use it. If that opt-in isn't enough, exactly how many levels do you want? If you do choose to use a free service, whether it's Facebook or a public library, you have to consider how it's paid for. Actively using something and also actively undermining its means of support ... well, I'll just leave that thought there.
> You mean you care about something, but you just won't do something about it
You seem to have some pretty unrealistic expectations of what individual employees can do at a 30K-person company, or about anyone taking the right action without deliberating first.
> Why do people need to "collaborate" on solutions? What do they get from it?
Ummm ... the solutions, which are not only applicable to Facebook? This is a general problem faced by many companies. The solutions could also be useful to the people who blather about creating a distributed alternative to Facebook. I've been a member of the decentralization and distributed-system community for far longer than Facebook or Y Combinator have existed. I also know something about the scale and connectedness of the data at Facebook. We're multiple basic innovations away from being able to create such an alternative. Wouldn't it be nice if people who actually understand various parts of this can talk and work together? That doesn't become more likely when every discussion is filled with people who only read others' comments enough to find where to insert their own half-baked opinions or insults.
>> If that opt-in isn't enough, exactly how many levels do you want?
Since you can't seem to count to 2, how about:
1. You let us share your data with others in return for free service
2. You don't let us share your data in return for paid service
>> If you do choose to use a free service, whether it's Facebook or a public library
Well, a public library is tax-funded, and people outside the library staff have a big say in its inner workings. So you can't even get your comparisons right.
>> Actively using something and also actively undermining its means of support ... well, I'll just leave that thought there.
Perhaps you should complete the thought, because I don't actively use the thing in question.
>> You seem to have some pretty unrealistic expectations of what individual employees can do at a 30K-person company, or about anyone taking the right action without deliberating first.
Really, as opposed to your very realistic expectations that everyone should just trust FB employees would have "done the right thing" had they not been caught red-handed? Oh right, because FB knows better what is best for everyone else.
>> Wouldn't it be nice if people who actually understand various parts of this can talk and work together?
This is truly bizarre. So if FB rolls over and dies tomorrow, does it mean innovation will come to a complete halt? Let us say you think, "oh, but it might take much longer". Does that automatically adversely affect people more than the damages that can be caused to society via rampant data collection? How can you be so sure? Oh wait, because you must be smarter than everyone else, as you got through the interview.
And finally, it is interesting all the things that you selectively left unsaid (exactly like other FB employees have been doing all the while).
- you don't have the courage (what an ironic handle) to discuss shadow profiles
- you never actually addressed the fact that no one from outside coerced the leak, which made your first comment more rhetorical than substantial
- you cleverly twisted the "collaboration" to be amongst FB employees, when the line that follows makes clear you actually meant collaboration between FB employees and its users (a dopamine hit for whom, exactly? so you are now assuming others cannot read either?)
Bystander here. Why the ban? It’s snarky in places for sure, but I’d say it’s a pretty solid set of points and counterpoints. It definitely “added something” to my experience reading this thread.
I suggest that you also identify the primary account behind it and give them a reminder too, or else they'll just keep doing it over and over again until their targets run out of patience.
> 2. You don't let us share your data in return for paid service
Personally, I think that might be a good option, but you can't claim to have made it explicit before so your "count to 2" insult is misplaced. I know that the only thing you've ever done since your account was created is bash Facebook (how nice that anyone can check that for themselves BTW), but even in that light such childishness is counterproductive.
> if FB rolls over and dies tomorrow, does it mean innovation will come to a complete halt
Total strawman. Nobody said or implied that. There's plenty of knowledge and innovation everywhere, but the amount that can come from Facebook only has to be non-zero to support my point. Several hundred developers who have collectively worked on almost every distributed system you've ever heard of might have an idea or two worth discussing. They might even have a perspective on scaling issues that's highly relevant to the problem at hand but not widely known outside of Facebook and maybe three other companies. Why do you try so hard to throw cold water on any such conversations?
You were teetering on incivility earlier in the thread and here you fell straight into it. Please don't! Instead, please read https://news.ycombinator.com/newsguidelines.html and follow the rules regardless of how badly anyone else is behaving.
1. Bosworth's reaction: "for fear it will be misunderstood by a broader population that doesn’t have full context on who we are and how we work"
2. Wrote another: “How fucking terrible that some irresponsible jerk decided he or she had some god complex that jeopardizes our inner culture and something that makes Facebook great?”
3. Back to Bosworth: "The post was of no particular consequence in and of itself, it was the comments that were impressive." (italics mine)
4. And lastly: "If we have to live in fear that even our bad ideas will be exposed then we won’t explore them or understand them as such, we won’t clearly label them as such, we run a much greater risk of stumbling on them later. Conversations go underground or don’t happen at all. And not only are we worse off for it, so are the people who use our products." (italics mine)
In other words:
1. we know better than you
2. the ends do justify the means (contrary to MZ's opinion)
3. don't mind the contents, just notice how smart the employees are
4. let us tell you what is best for you
The comments quoted are pretty shill-ish... but I found myself weirdly sympathetic to Bosworth's concern about leaks. If corporations are people, then their internal communications are like thoughts floating around before a decision is made. If you're properly distributing decision making in a company, individual sections are going to come up with ideas that are embarrassing in the big picture! Would you want your stray thoughts picked apart in the press?
BUUUUUUT (a) Bosworth is at the top of his org, so calling his own post a "not quite straw man" is a ridiculous cop-out. Like it doesn't have 1000x the weight of the comments below. And (b) the reason people are being forced to interrogate Facebook's "character" through internal docs is because a lack of candor has made that character impossible to judge based on public statements and actions. You did it to yourself, man!
But they're not, of course. Sure, I don't want my internal thoughts floating around, but I'm not a) several thousand people, and b) the personification of those several thousand people attempting to work out the best way of extracting other people's sub-conscious & internal motivations and desires, and sell them to marketers ... so my perspective is skewed.
The rest of the excuses in TFA seem very 'some people say...' or 'it's been said that...' style posturing, which may be an interesting academic exercise, but as you say it feels a bit of a cop-out to use that as a post-rationalisation.
It just shows some people inside think that outsiders are too stupid to understand/appreciate what the insiders are talking about, so it had better stay hidden. Some of the employees' comments also betray this belief.
Why couldn't a discussion about the future (features? :)) of Facebook be open, if it's such a big and far-reaching platform? What's to hide?
Media would have a hard time feeding on it, compared to the current situation where it is all interesting and whatever because some "secret" "leaked" and therefore it is "newsworthy".
Separately, it's pretty hilarious at this point to hear hand wringing about how a leak of private data is keeping FB from having an honest and constructive internal discourse. That's exactly how I feel about what leaking 50M profiles to a political devil bent on destroying constructive discourse has done to my favorite democracy.
Before I went to read the article, I wondered if they would define first and second party here. And they do:
Quote from the article: "In order to leverage the deep pool of data Facebook collects on users, the company mixes information that it obtains from users themselves (Pages a user liked, for instance) with information from advertisers (membership status in a loyalty program, for example) and with data obtained from third party providers."
It is super interesting that they have no terminology at all for data collected via shadow profiles. I propose "zeroth party" data - they are shadow profiles after all. :-)
It looks like the dark UX patterns of Facebook alone could fill an entire book today. Not that the other companies are any better in this regard.
To delete an item from your timeline there are, from recollection, three clicks involved, one of which I am pretty damn certain is throttled by a timer in the back end. The HTTP request takes like three seconds to complete every time.
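If the back end really does throttle each delete to ~3 seconds, the friction adds up fast. A back-of-the-envelope sketch (the per-click timing and the 3-second request figure are assumptions taken from the comment above, not measured values from Facebook):

```python
# Rough estimate of how long manually bulk-deleting timeline items takes.
# Assumptions (not measured): 3 clicks per item at ~1 s each, plus a
# ~3 s throttled HTTP request per delete, as described in the comment above.

def estimated_deletion_time(n_items: int,
                            clicks_per_item: int = 3,
                            seconds_per_click: float = 1.0,
                            seconds_per_request: float = 3.0) -> float:
    """Estimate total seconds to delete n_items timeline entries by hand."""
    per_item = clicks_per_item * seconds_per_click + seconds_per_request
    return n_items * per_item

# Under these assumptions, clearing 1,000 items costs 6,000 s (100 minutes):
# estimated_deletion_time(1000) -> 6000.0
```

The point of the sketch is just that a per-item throttle turns "delete your history" into hours of tedium, which is itself a dark pattern.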
Let us reserve our judgment on Google for now. But FB is far worse than what you say here. Suppose I group people into
1. Those who actively use Facebook and willingly tell it everything about their lives
2. Those who actively use Facebook and unwillingly tell it everything about their lives (who don't know what data it collects)
3. Those who once actively used Facebook
4. Those who have never created a Facebook (or WhatsApp or Instagram) account
The real issue is, FB likely has as much private data about group number 4 as about group number 1, because of their shadow profiles (which they euphemistically refer to as "future Facebook users"). And their current or past employees seem to be a little too conveniently blind to this matter. See here, for example:
https://news.ycombinator.com/item?id=16676720
Google also has a lot of category 4 data; although I don't have any Google accounts I am well aware that they track everything coming from my IP addresses, and from any of their cookies that I inadvertently allow through.
All Google's posturing about 'review and manage your data' is meaningless to me in that regard. Why can't I ask them to delete everything associated with my IPv6 block, for example?
The sibling comments make it clear what the split could be. In fact, people who say #deletefacebook are both overreacting and overreaching, when a much easier solution is at hand. It should really be #splitfacebook - that is an excellent outcome for most of the concerned parties (except maybe FB shareholders, which might as well be a good thing).
Up next: "How I wrote an article that explains how to do something, which actually doesn't do anything, and helps in absolutely no way other than make people feel better they did something rather than nothing"
Up next: how I wrote a comment hating and contributing nothing to the discussion.
Well, it does something: it erases your public comments on FB, and even more so, shows in a few steps how a programmer can go about doing so. Not everything has to be about what FB knows.
Agreed. Facebook might still have them in some log or neural thingy about you, but it will protect you from direct profiling like the Cambridge Analytica debacle. If it turned out that deleting the activity still allowed Facebook to keep the data and sell it to 3rd parties, it would be a company-ending event, so there is at least self-preservation at work.
Why do you presume that once you come off facebook, your data is not used? That is not true, the data gets used and pimped around to the highest bidder. What is true, is that as time passes that data has less value (assuming you actually stopped pumping in the data in fb).
To everyone who is saying "What's the big deal? Wasn't all of this obvious and well known?" - ask 10 people you know who the parent company of WhatsApp and Instagram is (and throw an irrelevant app in there just to make it less predictable). If your friends span a reasonable cross-section of ages and job titles, I am guessing not more than 2 of them would know the answer. Or, if you want to make it a little more humorous, ask "So what do you think of the recent Facebook scandal?". At least a few of them would say "Oh! I don't use Facebook anymore! I only use WhatsApp"
Because normal users are not techies. They just use whatever they want, without looking into the company behind it, its privacy practices, or anything else.
The world is about money. You have to realize that the best money comes from people who don't give a f$#k. They click on ads, they give their personal data, they buy the most, etc. We can rant as much as we want that Apple does not give us pro hardware, or that Microsoft does the same, but most of their income is from ordinary people who just use their hardware/software.
> Or, if you want to make it a little more humorous, ask "So what do you think of the recent Facebook scandal?". At least a few of them would say "Oh! I don't use Facebook anymore! I only use WhatsApp"
Because it started with Trump, and Trump makes people click on headlines. They have now realised that "Facebook is bad" also makes people click on headlines. That's all.
In any half-serious newspaper this isn't big news, but more of a reminder of a story they ran when Facebook launched this shit.
Someone I know closely worked at Facebook in its heyday, but it has been a while since he left. I asked him around 2014 (he had just left the company), "So what do you think about the way Facebook handles privacy issues?" His response was not defensive at all. Rather, it was a very curious "FB is one of the most open cultures you can ever work in. Any employee can ask any question of anyone at the highest levels and expect to get an honest answer". My thought was "So you didn't have anything to ask questions about?". He was actually a pretty nice fellow, so I stopped asking anything else at that point.
But I remember thinking that it was a very funny, cult-member like response. And you can test this too. Ask your friends who work at FB and I bet you will get some pre-programmed response very similar to that.
For example:
1. What was Mark Zuckerberg's response when people asked him if Facebook might be overstepping bounds in terms of data collection (shadow profiles)?
2. What did the company employees think of the backlash over their beacon project?
3. When Facebook told the EU that they cannot match FB user profiles and WhatsApp user profiles to create a single profile (remembering that they would be fined), what was the general consensus among employees? Did they know that FB had lied? Were they still OK with that? If they were, was there not a single person expressing dissent?
Maybe we are agreeing, but we humans don't have sufficient cognitive resources to extrapolate every little action we take into the future and predict new things technology can enable. Our brains are always looking for shortcuts ("do what the crowd does, because there must be at least some wisdom in it"). We also have the reasonable expectation that the companies will not turn our own cognitive limitations against us (such as Nir Eyal's "Hooked" encourages), nor do we know all the tools at their disposal if they so wish, and we probably never expected that these tools could be automated.
In other words, a person who is taking all the necessary steps and doing everything right in terms of the privacy of their digital footprint, is either off the grid, or is likely considered a tin foil conspiracy theorist by peers. Human survival instinct usually overrides such thoughtfulness.