
> The problem here is that misapplied empathy can lead to terrible decisions.

That's not the problem; that's a vague wave at a generic class of argument that could be used just as easily to rationalize not letting your child eat ice cream, or Japanese internment. You have to make the case for why Google's 2FA system is so much more important than the homeless having phone service; you can't just say "sometimes, empathy can be bad."

I'm not getting that case from the rest of the comment, which reads like a gish gallop through a bunch of other things we're also not going to do for the homeless, each of which invites the same reply: "it's only human to worry about other people going through these issues, but empathy can be bad. The answer isn't that HUD should change the second line of the third section of Form B; it's that we should fix the homeless problem completely."

edit: We can't use "we should be making larger changes" as an excuse for not making small ones. The excuses one makes to avoid small changes apply even more strongly to larger ones.



I can make a very specific case for it. Gmail has 1.5+ billion users, millions of whom are barely tech-literate and vulnerable, and it is a constant target for malicious entities. That means, intuitively, at least hundreds of thousands of vulnerable people getting cleaned out of their life savings. Changing things for billions in exchange for a marginal benefit to thousands is bizarre.

It's not a 'gish gallop' but a framework for looking at the issue. I'm not saying that empathy is sometimes bad; I'm saying that it can't be the starting point for our reasoning. It can be the impetus that makes us act, but the actual solution has to come first. Sure, maybe none of the things I'm proposing will be implemented. Maybe they're all godawful ideas, but I can't fix the problem in the five minutes it took to write the post, or even in five decades of intense research on my own. But it's clear that this pseudo-empathetic, performative-martyrdom mindset is an active roadblock to more ambitious solutions. And it leads to truly awful ideas, such as getting rid of encryption, rights, and so on.


So you don't want Google to do anything? Then what is the purpose of all this verbiage, which, moreover, unjustly dismisses the whole issue as a "marginal benefit to thousands"? Being able to keep or recover an email address is much more than a marginal benefit, and there are many more than thousands of homeless people in the US alone.


Maybe Google can do something. It just probably shouldn't be something that alters security measures for billions.

I'm not dismissing the whole issue; I'm saying it was presented in a way that isn't actually conducive to helping the homeless.

If you removed forced 2FA, you would be dismissing the hundreds of thousands (at minimum) of tech-illiterate people, out of 1.5 billion users, who would get cleaned out in the coming weeks. Why do their lives not factor into your calculus? Are they not vulnerable too? And all of this for a problem that could be resolved in so many other ways.

This is the problem I'm trying to illustrate. This sort of moral appeal helps no one, and in fact it endangers other populations. If the goal were truly to help people, no one would EVER suggest an alteration that exposes billions for the benefit of thousands.


Do you really expect people who care for the homeless to come up with a ready-made, technically feasible solution? Of course they will make moral appeals and suggest potentially dangerous solutions first. That happens all the time! "That ain't gonna work, go away" isn't an appropriate response here. Dialogue is, and for that we have to listen a bit.



