Given the history of propaganda and how we got into the Iraq war, this is an area of major concern for me.
On two different occasions I've noticed similar behavior: a "trending" news topic that was not trending anywhere else.
Both cases that caught my eye (surprisingly) had to do with vilification of Iran. The first time, I assumed it was either an organized PR effort or perhaps naturally trending.
However, the second time (May 9th) [0] it coincided with Gizmodo's article coming out.
That prompted me to search all the major news sites (on the logic that if it was on some major news sites, it would then circulate within FB, which would explain why it was trending).
So I decided to document this. I checked CNN, HuffingtonPost, NYTimes, WashingtonPost... no mention of an "Iranian missile test," but there was a small blurb at the bottom of the Fox News site [1]. Not enough exposure for it to get the kind of volume needed to be a top-3 trending story.
Facebook says: "The list of Trending Topics is then personalized for each user via an algorithm that relies on a number of factors, including the importance of the topic, Pages a person has liked, location (e.g.. home state sports news), feedback provided by the user about previous Trending Topics and what’s trending across Facebook overall. Not everyone sees the same topics at the same time." (My italics.)
If you're interested in Iran, you will probably see stuff about Iran that hardly anyone else sees.
I have zero interest in Iran and have never seen anything about it.
If you look at the image I posted [0], there is a trending topic about Justin Bieber. I have zero interest in him; in fact, I can rarely, if ever, recall having mentioned anything about pop culture, music, or anything like that (I like classical music and some classic rock). Given all the different topics I post about on FB, and Justin Bieber popping up anyway, I can only imagine this is not all personalized news.
I also follow many people who post news and information on Iran, and it's surprising that none of them showed up in trending, except for an article intended as vilification. Seems unlikely.
Even if FB news is personalized, FB's guidelines indicate they can inject any news they want into it.
I noticed the same thing with Yahoo News showing up in Yahoo Mail years ago and, last I checked, continuing today. That source also rapidly became hawkish on foreign policy issues.
What you're trying to do is draw conclusions based on very small sets of data. Sure, the article submitted agrees with your conclusion, but there's simply no way to tell if the conclusion you're drawing (and what's stated in the article) is related to your observations.
Your very small anecdotal evidence does not provide additional confirmation of the reported behavior.
The problem is that what Facebook says is at odds with what the leaked documents purport to show. Trending items and hand-curated items are not the same thing, and apparently they have been passing one off as the other, which is problematic.
It works exactly as FB has publicly stated. A set of topics is algorithmically detected from a variety of sources. Let's say this set has "many many" things in it, many of which are of low quality. The best comparison I can give you is Twitter trends. Humans approve some of these detected topics to appear in the trending module; let's say there are now "many" topics. Yes, curators have the ability to insert a topic into the trending list, and yes, they have the ability to make something appear in everyone's trending module, but that's not really used. It's mostly there so that you don't have to wait minutes for trending to detect that aliens have landed on the Washington Mall, or that nuclear war has broken out. In fact, the "everyone must see this now!" feature has never been used. This candidate set of approved topics is then algorithmically ranked for each person, using the criteria they have stated.
If you really care, it uses a GBDT (gradient-boosted decision trees) and the newly released Facebook Flow.
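For the curious, here is a minimal sketch of what that last, per-person ranking stage could look like. It is purely illustrative: the feature names, training data, and use of scikit-learn are my assumptions, not Facebook's actual pipeline.

    # Illustrative only: rank an already-approved candidate set of topics per user
    # with a gradient-boosted model. Features and labels here are hypothetical.
    from sklearn.ensemble import GradientBoostingRegressor

    # Hypothetical per-(user, topic) features: overall popularity, affinity with
    # Pages the user has liked, locality, and feedback on past trending topics.
    train_X = [
        [0.9, 0.1, 0.0, 0.2],
        [0.4, 0.8, 0.5, 0.9],
        [0.7, 0.3, 0.9, 0.1],
    ]
    train_y = [0.2, 0.9, 0.6]  # hypothetical engagement outcomes

    model = GradientBoostingRegressor().fit(train_X, train_y)

    def rank_for_user(candidates):
        """candidates: list of (topic, feature_vector) already approved by curators."""
        scored = [(topic, model.predict([feats])[0]) for topic, feats in candidates]
        return sorted(scored, key=lambda item: item[1], reverse=True)

    print(rank_for_user([("Topic A", [0.8, 0.2, 0.1, 0.3]),
                         ("Topic B", [0.5, 0.7, 0.6, 0.8])]))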
Wow, that is horrible. And equally horrible is the fact that you don't seem to find this problematic at all. Whether the features were or are regularly used is not really relevant: "algorithmically selected" implies a certain non-bias in the process, but if this is not how it really works, then one should be much, much more skeptical about the validity of any "trending" topics.
It's a common misconception that "algorithmically selected" implies a certain non-bias. Perhaps this myth stems from examples used to explain simple algorithms (e.g. drawing straws, or pulling names from a hat).
Complex algorithms certainly do exhibit bias, and not only that: it's intentional. The whole point of a "trending" topic is to discriminate (i.e., inject bias toward) trendy topics over less trendy topics.
So you're talking about only adding the bias you want it to add, but I hope you can see that's impossible to get 100% correct. There's no rigorous definition of trending; nor is there a rigorous definition of the topics to be selected from, nor are lots of other matters here rigorously defined. There's no way you're going to do better than some fuzzy algorithms trained on real data.
Critically, that tends to mean real data generated by... biased humans.
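To make that concrete, here is a toy sketch (entirely made-up data) of how a model trained on human-labeled examples simply reproduces whatever bias the labelers had:

    # Illustrative only: a classifier trained on labels produced by biased humans
    # learns that bias. The "source_is_tabloid" feature and the labels are made up.
    from sklearn.linear_model import LogisticRegression

    # Features: [mention_volume, source_is_tabloid]
    X = [
        [0.9, 1], [0.8, 1], [0.7, 1],   # high volume, tabloid source
        [0.9, 0], [0.8, 0], [0.7, 0],   # high volume, non-tabloid source
    ]
    # The human labelers systematically rejected tabloid sources regardless of volume.
    y = [0, 0, 0, 1, 1, 1]

    clf = LogisticRegression().fit(X, y)

    # Identical mention volume, different source: the model inherits the labelers' bias.
    print(clf.predict([[0.85, 1], [0.85, 0]]))  # -> [0 1]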
I would have thought that the algorithm would be entirely agnostic and based on incoming data only, without humans to distort it. E.g., I share details with a dozen friends about aliens landing in Central Park, they then share it on and on. Facebook's algorithm spots this, along with others doing the same, and says, "Hey, looks like there is a trend growing about aliens in Central Park," and from there acts on that data only: e.g., this topic was talked about 5 million times in the last minute, therefore bump it up the list to its corresponding position among the other "trending" topics.
Having humans involved will distort that process, thus meaning true trending does not exist. Well, on Facebook at least.
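For what it's worth, a purely data-driven version of that is easy to sketch. This is a naive, hypothetical illustration (the window length and threshold are made-up numbers), not how Facebook actually does it:

    # Illustrative only: naive trend detection by counting topic mentions
    # in a sliding time window, with no human approval step anywhere.
    import time
    from collections import defaultdict, deque

    WINDOW_SECONDS = 60
    TREND_THRESHOLD = 1000  # hypothetical mentions-per-window cutoff

    mentions = defaultdict(deque)  # topic -> timestamps of recent mentions

    def record_mention(topic, now=None):
        now = time.time() if now is None else now
        q = mentions[topic]
        q.append(now)
        while q and q[0] < now - WINDOW_SECONDS:  # drop stale mentions
            q.popleft()

    def trending(now=None):
        now = time.time() if now is None else now
        counts = {}
        for topic, q in mentions.items():
            while q and q[0] < now - WINDOW_SECONDS:
                q.popleft()
            counts[topic] = len(q)
        # Rank purely by volume.
        return sorted((t for t, c in counts.items() if c >= TREND_THRESHOLD),
                      key=lambda t: counts[t], reverse=True)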
Right. Like the time it was used when it was decided everyone in India got the post about supporting Facebook's Internet.org to save humanity or whatever.
I have a question, mostly unrelated, so I apologize ahead of time. Do actual humans review pictures that are reported? I've been reporting a picture for a few months and repeatedly get "This picture does not violate our community guidelines." It's a picture of the bloody, naked, mutilated corpse of a murder victim with a crucifix shoved down her throat. I have reported it fifteen or twenty times over the last six months with the same answer: that the picture doesn't violate Facebook's community guidelines. What's that about?
That sounds like the sort of question you should ask Buzzfeed, HuffPo or a similar website. Like Google and other companies, Facebook is normally very quick to respond to bad publicity.
There's nothing wrong with it; the problem is calling it 'trending'. Trending implies that these are the things that users on Facebook are talking about. If the list is, in fact, curated by Facebook employees, then that's fine, but it shouldn't be passed off as trending.
You are correct, I should have prefaced my statement with, "I think" or "I like to believe." I happen to believe cynicism is a seductive methodology, while sometimes useful. I don't care for FB's hegemony either, FWIW.
I'm inclined to agree with you and the implicit assumption of good faith. However, FB's trend information is an instance of real power and susceptible to the sort of hiding of capabilities and intentions that implies. It's fair for people to ask for verification though of course no one is required to provide it.
It's pretty easy to justify why you would lie, if I think you are an ideologue who truly believes that Facebook trending topics must be curated with a certain slant for the greater good.
When you believe in a conspiracy theory, hearing "why would I lie" doesn't change your mind, quite the opposite. It's also not true that there's "nothing that anyone can say that would prove that any system works as described". There's no general statement that will convince everyone, because everyone has personalized doubt about it.
It's certainly possible to assuage the fears of a conspiracy theorist - take their examples, give a credible, particularized explanation (not a vague "it's algorithmic", but describe the steps in detail), and show some goodwill to address their underlying fears. What you perhaps wanted to say was that you don't really care about convincing conspiracy theorists, which is fine, but very different.
I believe you, I just think that the thing you made is not really minor.
It has incredible reach and is quite effective in claiming attention. I actually had to use uBlock's element hiding function to keep myself from reflexively looking at it and reading a bunch of news stories. I would be surprised if that wasn't true for many people, as that's what it was designed to do.
I guess asking for proof is an unreasonable thing in this case as I can't think of what would actually constitute proof. I do think that additional questions are reasonable in this case as the editorial control of the ranking is currently a topic of national interest. I'm not really sure what that means exactly in this conversation, but I have a vague feeling that things that are big should be examined more closely. That's not super fair to facebook compared with a traditional media company, but this thing is new and isn't really like our models of how media companies like newspapers and magazines work.
It's a little scary because of the power a couple of headlines sprinkled here and there have when they're seen by a lot of people. Something that was nothing becomes a scandal, lies become truths in the public consciousness and can't be fixed (look at the antivax movement). Of course, they also have the power to do great justice as well.
I guess, my point after all this text is still that what you've built isn't minor, there will be and probably have been real world impacts from it. Those impacts stem from public trust in FB's editorial position. If the public doesn't trust FB to show unbiased stories, they interpret the state of the world vastly differently than if they trust FB. This is why I think it's reasonable to have questions, even if they may be somewhat pointed at times.
That said, I really appreciate the fact that you took the time to respond to this thread. Talking to the people directly involved is a great way to disabuse one of falsehoods. :)
EDIT:
For an example of a reason why people are interested in this capability, imagine a rogue FB employee were to push the following headline to everyone: "US Nuclear Arsenal Almost Fired at Russian City"
The world would see this post made by one person, and the fact that it was pulled nearly immediately would be seen by some as evidence of a cover-up, which could add tension and possibly instigate an international incident. Until this generation died out, you would have people all over the world who believed that the US nearly nuked a Russian city without provocation.
We can argue about whether there's an ethical problem, but the assertion that there's any sort of legal problem seems completely arbitrary and partisan IMO.
> Editorial control and algorithmic control are two different things.
It's also a meaningless distinction.
The point of contention is the bias, not whether the bias is imposed by an algorithm or by a human. People would still be complaining if Facebook had tweaked its algorithm to achieve identical outputs instead of employing humans to do it manually.
The truth seems to be somewhere in-between -- it was algorithmically generated and then the output was human-modified.
In general, I think the response to this post demonstrates the prevalence of magical thinking about computers, even by people who ought to know better. Bias is bias regardless of whether it's explicitly programmed behavior, emergent behavior from a carefully trained neural net, or enforced in human processes post-hoc. Distinguishing between these on the basis of where the behavior came from -- rather than whether or not there's a bias -- misses the point.
> I don't see any assertion that there's a legal problem in the parent.
The point of my post wasn't to contradict the parent post. There certainly are people who do claim it's a legal problem.
It isn't meaningless in this case at all. "Trending" suggests that it reflects, to some degree, popular opinion about what people are discussing. If the items aren't trending, but are hand curated, then you've essentially lied.
>Facebook says: "The list of Trending Topics is then personalized for each user via an algorithm that relies on a number of factors
In other words, they are lying. 'Trending' should be calculated directly as the topics of highest interest across all users. If it's not, it's a manipulated figure.
Whether a story trends isn't contingent on it being placed on the front page of major news sites. Most people don't get their news that way. A story is more likely to trend because someone notable talked about it or linked to it.
Are you saying news about Iran's military actions is not important to the US and its allies?
Perhaps users on FB find it more interesting and they surface it. Why does news on FB have to mimic the mainstream media?
Many times we have to read the stories about the US as reported from other countries because the MSM misses them or decides they aren't important enough.
> Given the history of propaganda and how we got into the Iraq war, this is an area of major concern for me.
This is how the news media has operated in the US for almost all of its existence. I wouldn't be too concerned about it; just take the news you read with a grain of salt.
I've said much about this topic, as you can see in my comments. I've since thought about what should be done and I've come to a simple conclusion: Facebook is Mark Zuckerberg's business and enterprise. He can do with it what he wants, just as any business or enterprise has the right to conduct their own business. If he wants to use it to push a globalist and left agenda, by all means he has the right to do that.
It's up to us, the consumers, to vote with our feet. I haven't used Facebook in years and I'm certainly glad not to be using it now that this comes to light, as I disagree with its agenda.
Does anyone have an alternative argument as to why Facebook DOES NOT have the RIGHT to suppress and promote information based on its own agenda?
Sure, FB can push any agenda they want. The issue is the deception. The product is marketed as organic and representative, and a number of people are indicating this is not actually the case. I'm not really familiar with current law around this type of behavior, so I have no idea about its legality.
What I would compare it to, though, are other deceptive business marketing practices, like photoshopping before-and-after pics for weight-loss products, or marketing your dog food as premium, all-natural, and meat-based when it is in fact the same junk, bought from the same supplier, as another dog food brand that costs half as much (and is specifically called out as inferior in commercials).
At the least, even if not criminal, it would seem to open the door to civil lawsuits.
Facebook dropped algorithmic news curation "after criticism that it had not included enough coverage of unrest in Ferguson, Missouri, in users’ feeds".
Its current "agenda" is based on reputable news sources, plus Fox News (1):
"We measure this by checking if it is leading at least 5 of the following 10 news websites: BBC News, CNN, Fox News, The Guardian, NBC News, The New York Times, USA Today, The Wall Street Journal, Washington Post, Yahoo News or Yahoo."
I'd say that was a fair reflection of mainstream media.
If you object to those, what's your personal agenda based on, and how mainstream is it?
(1) Fox News has specifically said in court that it is not required to be truthful.
I was going to vote your comment up, but I refrained. I also didn't vote it down, and I want to mention why.
I don't see the need for you to put those digs in against Fox News.
I get people don't like Fox News. I'm not a fan. But everyone here (on the whole) seems to hold a similar opinion, so the digs aren't necessary.
They weren't even germane to your point. You were making a perfectly good argument that numerous sources from both sides of the spectrum were included. I felt your commentary and singling out of Fox News distracted me from that.
Pointing out that Fox News is not actually a reputable news source is simply stating a fact. It was germane because the other news sources are mostly regarded as legitimate news organizations. Pointing out that Fox News is not and yet is very popular is an important point and completely related to this discussion.
To me it seemed like your point was that they were including sources across the spectrum. Simply noting Fox is in there serves that point.
If you had editorialized about the quality of the other sources then the commentary on the quality of Fox would have fit in.
Your comment seems like two disparate things in one. On the one hand, it notes that Facebook appeared to try to balance things. On the other hand, it felt like a slam at Fox News couched in the form of a comment that might fit in. Clearly you can't write "Fox sucks" as a comment because it wouldn't be on topic. But if you weave it in...
It just distracted me from what I took to be your point, harming the force/usefulness of your comment.
I think you're missing the parent's point. He's not saying that FOX News isn't uniquely bad, or that he wanted to see MSNBC there too. But that adding commentary on the quality of sources in the first place distracted from the already-established point of the post.
Secondly, I too won't claim that FOX News isn't uniquely bad. But the criticism of it that people toss around always feels kneejerk rather than pointing out that it's leading the pack of disastrous "news" programming. After a decade of Jon Stewart vilifying FOX News, he didn't actually get people to think critically about news sources - just FOX. And it's not even critical thinking; he just managed to demonize the outlet away from being taken seriously.
What is the purpose of pointing out that FOX is a bad news org to a left-leaning crowd? Or a mixed crowd that is smart enough to already know of the well-established criticism of the network? I can't help but see it as perpetuating demonization rather than elevating critical thinking applied to all news sources. In either scenario FOX News is trash, but in only one of them are you actually helping people become better news consumers.
I didn't think it was a dig or gratuitous. It's a matter of fact that Facebook includes Fox News in its news sources, and as EdHominem points out, it really isn't.
I agree that it's a reasonable list for U.S. news. Yahoo is a weird one to have on the list; do they do any original reporting (of real news, not entertainment)? Maybe they're a reliable source of AP or other newswire services.
Yahoo has a pretty strong sports reporting group. They have many of the heavy hitters in basketball media, for example. It makes sense to me that they'd be a common source.
(If you're gonna be That Guy and hurr about sports being entertainment, you can pass.)
You know, I really doubt that Mark wakes up in the morning, meets with his editing team, and says, "Guys, we need to push the globalist left agenda." The company is massive, so to say that he has some kind of finger on a "left agenda" button is way oversimplifying things.
There is evidence that he actually does decidedly (and within his rights) promote the "left agenda". He's been pushing Black Lives Matter within the company, and anyone who expresses disagreement irks him[1].
I think if you read that article, he isn't pushing 'Black Lives Matter' but rather asking employees to stop crossing it out and writing 'All Lives Matter'.
His quote from that article explains it:
"'Black lives matter' doesn't mean other lives don't. It's simply asking that the black community also achieves the justice they deserve.
We've never had rules around what people can write on our walls — we expect everybody to treat each other with respect. Regardless of the content or location, crossing out something means silencing speech, or that one person's speech is more important than another's."
There may be other cases of him pushing the "left agenda" but I don't think the article you link shows it.
> Otherwise you would not single out a single group as more deserving of life than others.
It doesn't say "Black lives matter more than others."
Everyone who understands why it's "Black Lives Matter!" instead of "All Lives Matter!" also understands why you think it should be "All Lives Matter!", we just disagree. There's no subtlety to your point, it's blunt and easy to understand.
The difference is, people who think it should be "All Lives Matter!" don't understand why we think it should be "Black Lives Matter!". There's a nuanced point that you don't seem to understand. When you suggest that we're saying black lives matter more than any other lives, you make it clear that you don't get it. This is fine, if you stop misrepresenting the intentions of others they might be willing to explain it to you.
Edit: And by the way, you don't have to agree with the nuanced point in order to understand it. So Zuck insisting people not cross out "Black Lives Matter" is not him pushing that as an agenda -- it's just him showing an understanding that "Black Lives Matter!" is not the same as "White Power!"
> Everyone who understands why it's "Black Lives Matter!" instead of "All Lives Matter!" also understands why you think it should be "All Lives Matter!"
Really? Didn't this slogan arise in the context of a black life trying to kill, or at least harm, another life? How can they possibly defend such a person? And to make him the center of this saying?
It's very clear they mean "Black life matters more than other life." If they meant anything else they would not defend attempted murderers and thugs.
The statement 'black lives matter' doesn't cause white_lives['matter'] to evaluate to false, or to less than black_lives['matter']. Sometimes you call out a specific thing just to assert its truth.
> Sometimes you call out a specific thing just to assert its truth.
It doesn't work that way in English. If you call it out it's because you feel there is something distinct about it that is different from the other members of its group.
Saying "Black lives matter" implicitly says "and others don't", and even if you won't agree to that, at a bare minimum it says "black lives matter more than other lives".
It doesn't help that this slogan arose in a context of defending a black life that was trying to take another life, or at least harm another life. By defending such a person, as they have done, they are very clearly saying "that black life matters more than other life".
Context matters. Every time they use that saying it reminds people that these activists are defending attempted murderers.
If you call it out it's because you feel there is something distinct about it that is different from the other members of its group.
Exactly. The thing that's distinct is that, in the US, it is necessary to assert that black lives matter. It's not necessary to assert that white lives matter because that's a given.
You may not agree with that view, but that's the view held by most people who say "Black Lives Matter!" You seem convinced that your alternative hypothesis is true, that we all think black lives matter more than other lives. Ask yourself honestly if your alternative hypothesis is just a convenient straw man, easy to argue against, easy to use to cast people who disagree with you as horrible people.
I'm curious what you think of the common advice of parents to little boys, "Treat women with respect." Would you start blathering about how you should treat everyone with respect? Obviously you should treat everyone with respect, but there's a reason to call out women in particular as deserving of respect -- because it's all too common for them not to get it.
You may not like this way of getting a point across, but if you pretend it doesn't exist you risk coming off as a person who doesn't understand nuance.
So, now that you understand this, could you please stop suggesting that people who support "Black Lives Matter!" think other lives don't matter as much?
> that we all think black lives matter more than other lives.
No, I don't think all of you think that. I definitely think the founders of the movement think that.
> I'm curious what you think of the common advice of parents to little boys, "Treat women with respect."
Since you asked, I would find that quite sexist and I would never tell anyone that, because it implies that women are inferior and need special protection.
If in fact they do need special protection, then fine. For example the anti-discrimination against blacks laws of the Civil Rights Act in 1964.
Now, I know what you are thinking: Blacks do need special protection, so we need this slogan. Except that the heroes of the slogan are criminals who brought things on themselves.
The fact that that is the origin story of the slogan is a big problem for me, and I have a hard time accepting that anyone would support it.
> You may not like this way of getting a point across, but if you pretend it doesn't exist you risk coming off as a person who doesn't understand nuance.
I understand it, I simply disagree with it. It's not the same thing.
> So, now that you understand this, could you please stop suggesting that people who support "Black Lives Matter!" think other lives don't matter as much?
I'll stop suggesting it, when they stop supporting criminals.
There is a pretty massive difference between choosing the values he chooses to promote amidst employees and systematically changing news stories.
For example, I would routinely fire most employees who expressed sentiments similar to Donald Trump's in an office. I would not censor Trump from customer feeds though.
It's more like this: people in tech for the most part have similar political views, so similar that they almost seem common sense. Take LGBTQ rights. Safe to say Silicon Valley is more pro than con on this issue (and benefits from having a larger and more diverse talent pool), so it is easy to see that perspective being the "correct" side of the argument and seeing the other side as mean-spirited, bigoted, on the wrong side of history...you get the idea.
And yet, in essence, the OP could be right, too. Most of us simply cannot know how Mark or various of his minions (at whatever level) choose to use the various soap boxes they have at their disposal.
Rather than oversimplifying, I think the OP is more likely correct and I fully expect Mark and team to wake up in the morning wondering how they can use their privileged access to many millions of users to advance whatever agenda they are supporting that day (political or economic); within some limits of "seemliness", of course. I expect them to act on those thoughts, too.
Perhaps it is an oversimplification to bundle ideologies into a monolithic "left" and "right"; reality there is more complex than that. However, for the purposes of generalization it's not unreasonable to group people together when many of their ideologies bear resemblance to the archetypes of political division. To be overly literal about that is, itself, being overly simplistic.
I think they do have that right, however, the issue is they SAY the stories are 'trending' when in fact they are simply curated by facebook. That needs to be made clear. Passing off curated content as 'trending' is like simply making up polling data and passing it off as public opinion when in fact it's not.
Who says algorithms are the only possible determinants of something which is trending?
The article specifically points out that human editors were introduced after Facebook failed to surface Ferguson protests—making it look bad because those protests were in fact trending on Twitter. Moreover, some theoretical perfect algorithm could have asserted that was a trending topic.
In fact, there is no perfect algorithm for determining what is "trending." Part of hiring human curators is to train an algorithm that can more effectively use judgement in figuring out topics which are truly trending/newsworthy/interesting, instead of topics coincidentally used a lot without being valuable (e.g. #FollowFriday).
Moreover, it seems clear that injecting stories is actually relatively rare. The primary reason to have human curators is to do filtering to prevent the junk from showing up in trending.
I'm not answering for the OP, but given that left-wing ideology presents a history founded on struggle among oppositional groups, the power of any given group should be controlled (as in mediated) by said struggle. For example, Facebook's power to disseminate reporting and propaganda should be mediated by an opposition from the sources and seekers of firsthand accounts.
One could argue that the New York Times or the DrudgeReport also have power to control and persuade. It's an issue of free speech and if it applies here.
There is only one society, and its members ought to have freedom of speech and assembly amongst themselves. Facebook, just like the Internet, is a platform, and it should allow free speech, even if products that are built on top of that platform themselves don't. It seems to be in the nature of networks to be a monopoly; if Facebook or Internet weren't a monopoly, perhaps competing networks could provide free speech and Facebook and Internet could be fully censored. But as it is, censorship and bans diminish the freedoms of speech & assembly of the users.
>> He can do with it what he wants, just as any business or enterprise has the right to conduct their own business.
Unless you're a publicly traded company, which Facebook is. Then you are beholden to the people who are invested in your company, some of whom, I would assume, are probably conservative and right-leaning.
If FB were still a private company, then yeah, your sentiment would hold true. All bets are off when you're a publicly traded and funded company. At that point, it's the investors who tell you the direction of the company. Don't like it? Then buy out all the shareholders and return your company to the private realm.
>Facebook is Mark Zuckerberg's business and enterprise. He can do with it what he wants, just as any business or enterprise has the right to conduct their own business.
It may be a little nit-picky to point it out, but in the general sense this isn't true. As soon as the company went public, it stopped being Mark Zuckerberg's business and enterprise, becoming instead the property of the shareholders who employ Zuckerberg to look after their company. Zuckerberg owns less than 30%.
Hmmmm. This is a pretty interesting (and unusual) story, now that I look into it. Part of the arrangement when FB went public is some of the larger investors ceded him irrevocable proxy rights for their shares, giving him over 50% of the votes even though he owns only 28.2% of voting shares. It's not so much a difference in share classes as a sort of side deal.
But the point remains, even if it's kind of technical when he can always outvote everyone else. He can't treat the company as his personal fiefdom if it crosses the legal interests of other shareholders. If he goes too far they can probably sue successfully.
> He can do with it what he wants, just as any business or enterprise has the right to conduct their own business.
We all have legal rights to do things that are morally wrong. He has a responsibility to the world to be a responsible citizen, just as we all do.
Prior and current generations of responsible citizens are largely responsible for Facebook: without generations of people doing the responsible thing, even dying, for democracy, liberty, science and technology, education, infrastructure, law, etc., he'd be living in a country and world where there would be no technology, legal rights, literacy, or many of the other things necessary for Facebook.
If you read the article, nothing was being "suppressed". The sources mentioned are extremely tabloid-esque in nature.
This is only biased if we can provide concrete evidence that similar tabloid-esque publications that happened to align "left" of the political spectrum in the U.S. were allowed to pass while their political counterparts were passed up.
If the standard for journalism remained constant, but the sources that were "suppressed" largely happen to be conservative that doesn't seem like a problem with curation, it seems like a problem for the conservative media ecosystem.
I fully expect, despite adhering to basic scientific principles, to be downvoted yet again in this thread, because I already see a lot of the same names from the last time this article was posted and politically charged rhetoric (i.e. "leftist agenda").
The "conservative media ecosystem" in the USA is detached from reality. That was vividly illustrated by the shock at Fox News when Obama won in 2012. They evidently believed enough of their own lies that they couldn't handle the truth.
If Fox News had actually reported the truth all the way through the campaign, they would not have been surprised.
The ironic part about your statement is the fact that the Republicans took back both houses of congress in the mid-terms.
Even if they were detached from reality, I'd say that was a huge punch in the nuts for the Democrats after re-electing their president to a second term, and holding a super majority for 2 years.
You wanna know why Donald Trump has swept the Republican establishment? Because after all the promises by the Republicans about what they were going to do after being elected, they've done almost nothing they promised and their constituents are pissed and sending a message - time for the establishment to go.
It doesn't matter if Trump wins or not. The message has been sent, and if the "establishment Republicans" don't get it, and get in line, then they might as well hand the Presidency back over to the Clinton political machine.
Not ironic at all, it's a common occurrence in mid-terms for the party opposing the president to do better.
You seem to be calling holding a majority in the House and Senate a "super majority," that's not what a super majority is. Generally in the U.S. federal government a supermajority requires 2/3rds in the respective body (e.g. a party holding 67 seats in the Senate would have a supermajority there) [0]. Senate rules also make 60 seats a magic number to reach on many votes but one party hasn't held that many seats since 1979.
The whole media ecosystem in the USA is out of whack. The lefties crap on Fox, righties crap on MSDNC, but both could better be thought of as news organs of the same establishment, with emphasis on opinion shaping, and de-emphasis on truth. Each target a different population segment. Facebook targets yet another segment, and the question is, will it become annexed to the same establishment?
Facebook doesn't target a different population segment or a different political segment of the US, or even the US. It wants everybody to be on Facebook (if they're over the age of 13). That's how it got to 1.65bn users.
But that's a self-selecting audience. Facebook users can either look at trending news or ignore it. It's not a political demographic, or if it is, that's incidental.
It's a segment of the populace who have thus far been uniquely positioned to make up their own minds. That has political implications in a democracy that relies on media to shape the electorate's consensus. As Milo Yiannopoulos recently commented on the 2016 primaries: "If the lying mainstream media ever had the power to shape elections, those days are gone." Will that baton simply be picked up by the new media?
This thread makes it seem like the general opinion is that an automatically generated news feed would be better.
It wouldn't. Not for end users anyways. It would be a marketer's dream.
If you feel like that's the way things should be, write one. Make it popular. Sell it to Facebook. (And use the money to buy stock in your roommate's new online marketing firm)
That's not the point, most people have the impression that the trending news is automated and generated by some sort of algorithm. It's highly misleading. The trends should really be called "Facebook's Top Picks" instead.
Do intelligent people believe this? That's as naive as thinking Hacker News' front page is determined solely by the up arrow. Or that Product Hunt isn't controlled by insiders.
Children believe in Santa Claus and the Easter Bunny. Is it shocking that they're wrong?
Are intelligent people the only people worth looking out for?
Trending implies humanity cares about it, which to the lowly average Joe means he or she should care about it too. This is the part that feels disingenuous on Facebook's part.
Of course, even the average Joe knows (though probably can't express) that everything around him just wants to consume his attention, so I'm also not overly concerned about it.
Google understood this long ago. You need human editors/curators who can manually flag spam. Otherwise your index will be filled with spammers who gamed the algorithm correctly. Those who argue differently have never been responsible for such a large-scale search/media operation.
That's true. The issue here isn't that Facebook tried and failed to implement an automatically generated news feed. The issue is they were called out for bias in the news feed, and then tried to deflect by blaming it on a computer.
You can't get away from human judgement. If you stop relying on editors' judgement, you'll end up with spam and people gaming the system. To fix the spam, the process of generating the feed will have to be tweaked. The nature of these tweaks (not to mention the design of the original algorithm) reflects the judgement of the human programmers.
Doesn't even have to be commercial spam. With classic hits like Santorum and mission accomplished, I'd think they'd be grateful for a little auditable curation.
The big problem with this Facebook controversy is the number of people getting their news from Facebook. There are thousands of other places to consume news, and they do a better job. I almost never click on the Trending topics on Facebook, because I always assume they have been curated/customized based on my profile. I actually use the NYT, Twitter, the Washington Post, etc. to get news and trending topics.
NYT and Washington Post are subject to the same kinds of biases - in fact, moreso, since what comes up is ultimately in the hands of the executive editor.
Twitter also likely uses manual curation.
If Facebook improves an algorithmic system augmented with human consensus (that diffuses the editorial control among a group of curators), it will be the least biased of all the mentioned systems.
Ironically, Facebook is getting criticized for being more like the NYT and the Washington Post, by holding back stories which aren't credible or newsworthy.
I agree, but all of this hubbub is about the Trending News section. Nobody seems to be saying that Facebook put their finger on the scales of what shows up in news feeds after being shared by friends. I suspect that when people say "60% of millennials get their news primarily from Facebook", they are talking about shared articles and the like, not the "trending now" section.
Wake me up when any of the people shitting the bed about this have ever complained about the power wielded by Rupert Murdoch. I might think they were doing something other than having a tantrum about someone outside their own echo chamber having any kind of power over the news.
There's still a heap of people that cry foul about Murdoch, especially here in Australia. News Corp has a near-monopoly on the news here, and people do get vocal about it.
I always wondered - I live in the UK and "trending" topics seemed to be a mix of things that were on the BBC news 3 days ago and obviously promoted content (large company X releases Y).
Completely useless in any case so I tend never to click in that box.
There is absolutely no evidence that you can pay to get into trending.
If there were, it would be a much bigger story than this flimsy article which is rooted in the editorial judgements of a few contractors.
It really annoys me when these drive-by insinuations about "promoted content" are made. Digital media companies—including news outlets, Facebook, and Google—all actually are careful to label anything which is paid placement as such. All these baseless accusations of unlabeled paid placement do is undermine the moral standards for labeling paid placement.
If you are going to make extraordinary claims, please bring extraordinary evidence.
Does money need to change hands for something to count as promoted content?
Companies providing perks, quid pro quo services, pre-written news stories, and other non-financial advantages are part of the game. None of those would trigger anything that would be labeled as paid placement, but they are still promoted placement.
If you look at the totality of content many news organizations put out, there are a great number of slightly edited corporate press releases of the sort you are mentioning, and people read such things.
It is also worth noting that FB trending topics can be tuned.
Maybe they don't frame it very well, but we should probably assume people are making decisions, whether it's the people who are building the algorithms or those hand-picking the content... someone's bias is going to work its way in there.
I suspect this doesn't only happen in the trending news feed.
From my own experience: I shared an article by Douglas Rushkoff (https://www.theguardian.com/technology/2016/feb/12/digital-c...) and several of my friends weren't able to see it, even after they opened my timeline.
> The company backed away from a pure-algorithm approach in 2014 after criticism that it had not included enough coverage of unrest in Ferguson, Missouri, in users’ feeds.
This strikes me as interesting, if only because it says something about the audience of Facebook users.
Sense of community aside, the reason I keep coming back to HN is because I trust that the audience will upvote stories that are interesting to me.
I guess because Facebook has such wide community demographics, it can't rely on its users to do the policing.
I don't see a problem: No software is good enough to do it on its own (AFAIK); why wouldn't they improve the quality with human editors? Every other news source does it, AFAIK.
Will there be hearings about Fox News and whether they are biased?
I do have a concern: Facebook isn't a journalism organization run by professional journalists with their priorities, values and expertise. They could be very manipulative; many News Corp / Rupert Murdoch publications already are.
Lawyers and journalists (or humanities students in general) tend to be liberal, in my opinion; self-employed people, farmers, etc. tend to be more conservative generally. You can't employ journalists and expect them to be neutral. I had always seen Facebook's Trending section as "The Politically Liberal Outrage" section right from the start. I thought it was "intended to be liberal outrage" to begin with.
Algorithmic trending based on what is popular with human filters to weed out only NSFW stuff appears to be the way to go forward.
I can't blame Facebook. We live in a world where, if a company's experimental AI classifies black people as monkeys, we somehow think that is racist and make the company stop those AI efforts.
In reality, a truly neutral platform would trend what is really trending; if that is a racist slur, so be it, because then the media is truly holding a mirror to society. That is how we might get better. Otherwise everything becomes one large safe space, which is the death of intellectual speech.
"Self-hosted" social networks using centralized authentication and a public API allowing for custom web and mobile clients to be built, sold, licensed etc. Moddable ranking algorithm. Monetize as a PaaS vendor for consumer and enterprise networks. I'll hold your beer if anyone wants to do it.
Its raison d'être is to expose people to stories beyond the narrow bubble of their news feed. But apparently people don't like the idea of using human curation to get well-balanced stories in front of readers, so it would be much better if people only spent time watching cat videos in the news feed.
Also, people need to make a clearer distinction between the trending topics view and news feed. They're separate products with different algorithms entirely.
PageRank was a huge leap in knowing which web pages were valuable.
I'm disappointed we don't have something similar for detecting authority, and measuring the opinions of those with authority.
For instance, if there were an article about chess, I would trust Bobby Fischer and Garry Kasparov to provide the most valuable commentary. If they say it's great, I'm more likely to read it.
But if there's an article about Jewish business owners, I don't want to see Bobby Fischer's opinion.
If there's an article about the government providing welfare and social services, I don't want to see Kasparov's opinion.
How is this not a thing for science, technology, news...?
"This technology article should scare you!" If I see that from Linus Torvalds, it's going to get my attention. From John McAfee? Not so much.
"Nvidia's latest drivers are terrible." If that's from AnimeFan2004, I don't care. If it's from John Carmack, holy crap does it deserve my attention.
More importantly, if John Carmack trusts someone about computer graphics, I'll probably trust them, too. If they trust someone else about computer graphics, there's a good chance I will, too.
We've gotten used to seeking out the opinion of Rotten Tomatoes, or the old Siskel and Ebert thumb-based-metric... It's a shame we don't have a browser extension that brings those metrics to everything we see on the web.
I'm watching the trailer for the new Marvel Civil War movie... and I see a Pop-Up-Video style bubble informing me that Kevin Smith really liked it.
I'm reading an article about how Facebook is bringing internet, but not really internet, to India, and I see EFF crapping all over it. Right there - right on that same web page, because my browser extension brings that content in for me.
I'm reading an arxiv about gene transfer in plants, and I see experts in the field saying they question the methodology.
It just sucks to me that we don't have a PageRank for authority on topics... And we don't have a way to show those scores and opinions, THAT WE TRUST, stuck to the content.
For instance, I trust Al Gore on Climate Change. Other people probably trust Donald Trump on Climate Change. It needs to be per-user to determine their authority-trust links.
Or helping determine our news feed. (To finally relate my comments back to the topic at hand.)
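Something in that direction already exists in graph form: personalized PageRank over a "who trusts whom" graph gives a different authority score per user. A minimal sketch, with entirely hypothetical names, edges, and weights:

    # Illustrative only: per-user authority scores via personalized PageRank over a
    # hypothetical "X trusts Y on topic T" graph (here, the topic is graphics).
    import networkx as nx

    G = nx.DiGraph()
    G.add_edges_from([
        ("me", "John Carmack"),            # I trust Carmack on graphics
        ("John Carmack", "GraphicsDev42"), # Carmack trusts someone I've never heard of
        ("me", "AnimeFan2004"),            # I follow them, but that's all
    ])

    # All restart probability goes to "me", so the scores reflect *my* trust links;
    # a different user with different edges would get different authorities.
    personalization = {n: (1.0 if n == "me" else 0.0) for n in G}
    scores = nx.pagerank(G, alpha=0.85, personalization=personalization)

    for name, score in sorted(scores.items(), key=lambda kv: kv[1], reverse=True):
        print(f"{name}: {score:.3f}")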
This is probably the worst fucking argument I've ever heard. If you're dismissing Fox News as being biased, then you need to eject all the rest of the MSM coverage too, because they're equally biased if not worse. Fuck, MSNBC doesn't even try to hide the fact that they're on the Democratic team; it's practically their own propaganda tool, for God's sake.
I'll gladly take a little news from a source that's not obviously pitching me the Democratic talking points night in and night out.
If you check Politifact http://www.politifact.com/ you'll find that most of what Clinton says is mostly true, whereas practically everything Trump says is mostly false.
Fox, like Trump, doesn't feel the need to tell the truth. MSM (1) generally does.
(1) For Facebook, MSM means: "BBC News, CNN, Fox News, The Guardian, NBC News, The New York Times, USA Today, The Wall Street Journal, Washington Post, Yahoo News or Yahoo."
Politifact's parent organization, Tampa Bay Times, is part of the MSM, and not a third-party watchdog. It is incredible that they have brought fact-checking into the mainstream, but their ability to have done that is at least in part thanks to their being a part of it. Anyway, they don't provide stats grouped by people (EDIT: they do, my bad!).
Fact-checking cannot provide the kind of metric you talk about for rhetorical devices that are not drily descriptive, such as irony, hyperbole, allusion, etc. It helps that the readers have a way to look up the facts if they don't know them, but the analyses need to be read critically, same as the original statements. It's no good if you distrust one set of people and uncritically trust another. On what basis? You don't trust the politician, that's good. You don't trust some media, that's also good. But whoever in the media you choose, don't trust them blindly, they are just normal people, not superheroes.
Take, for example, this statement: http://www.politifact.com/truth-o-meter/statements/2016/may/... There is no clear-cut right or wrong. Same-day voter registration puts a pressure on the bureaucracy to be more forgiving and less thorough; that's a fact. But is that driven by the political elements who benefit from illegals voting, and do people on either side actually believe that it has an effect? That's not a factual question, that's a question for a political debate. Mr Trump's hyperbole captures this with a surprising pith.
Trump's claim was analyzed. Independent experts were consulted. It's obvious why the ruling went against him.
I don't think distinguishing between hyperbole and lies is Politifact's problem. It's simply calling a lie a lie.
I appreciate that Trump tells lies for political advantage, and that most politicians do the same. However, Trump is in a different league from Clinton, just as the NYT is in a different league from Fox News.
> Trump's claim was analyzed. Independent experts were consulted. It's obvious why the ruling went against him.
> I don't think distinguishing between hyperbole and lies is Politifact's problem. It's simply calling a lie a lie.
When I was growing up in the Eastern Bloc, we looked up to the U.S. for their people's ability to think critically and to have their own informed political opinions. I am horrified at your implying that you outsource your thinking. We used to say, "experts were consulted", "it's obvious", "the ruling went against him". That's totalitarian language, describing a totalitarian concept. Why would you throw away your freedom of thought, willingly to boot?
Let me invite you to analyse that statement by Mr Trump and its analysis by Politfact and my comment above. What is your opinion? Can you give good arguments for it, and anticipate counter-arguments against it?
Nobody is throwing away their freedom of thought, and consulting independent experts is what journalists do. That's how we work (source: I am one).
> we looked up to the U.S. for their people's ability to think critically and to have their own informed political opinions.
Don't confuse opinions with facts. Opinions are personal and facts are (as far as possible) universal.
However, I'm unlikely to have a good opinion of someone who says too many things that are factually untrue, which is the case with Trump and Cruz.
> When I was growing up in the Eastern Bloc
Correct me if I'm wrong, but the Eastern Bloc had rulers who could say things that weren't true and not have those things exposed as lies by the MSM. That's why we fact-check both Trump and Clinton. That's what democracy is based on.
> "the ruling went against him". That's totalitarian language, describing a totalitarian concept.
No it isn't. There's absolutely nothing totalitarian about it. It's exactly the reverse.
On the evidence presented by Politifact, what Trump said wasn't true. If you want to dispute that, you have to disprove the existing information or find more information that contradicts it.
Merely having an opinion about it adds nothing: it's not worth a damn.
This is not complicated. My opinion (the statement nicely sums up a defensible political position, and the Tampa Bay Times' assessment is wrong) is based on the facts they brought to the fore, as well as a few of my own, as any argument would be, but I also show my own reasoning:
Same-day voter registration puts a pressure on the bureaucracy to be more forgiving and less thorough; that's a fact. But is that driven by the political elements who benefit from illegals voting, and do people on either side actually believe that it has an effect? That's not a factual question, that's a question for a political debate. Mr Trump's hyperbole captures this with a surprising pith.
Politifact [...] is [...] not a[n independent] third-party watchdog. [... Further, f]act-checking cannot provide [data about speakers' telling the truth] for [statements using] hyperbole [...].
Your overarching position seems to be that this stuff is too complicated for you to reason about on your own, and you need to rely on other (presumably qualified) people to do some of your analysis (I disagree); Politifact provides a good unbiased, trustworthy, well-sourced, and most importantly impartial set of facts (I disagree); and the candidates that have the most fraught relationship with truth just happen to be those who you oppose politically (we may agree on that one). That's, I believe, at the core of a big problem with American journalism: the intellectual trepidation to analyse facts independently, based on all available evidence, and present a conclusion that is one's own; in a word, the pretence that a journalist can escape bias by simply choosing between what others are saying, without ever directly challenging their reasoning or assumptions.
That is not really important: all I'm asking of you is to present, preferably without swearing, your own reasoning for why you think Mr Trump gave anything but an honest answer on this particular occasion.
Politifact isn't seen as nonpartisan. The Tampa Bay Times is the parent org and they have not endorsed any Republican candidates for President. They are a liberal newspaper and that is reflected in their fact assessments.
One can be correct and have an opinion at the same time. They're not the same thing.
Even if you don't agree with an opinion, that doesn't make it wrong. You may be even more biased the other way.
If you are not correct, then your opinion is not worth very much. See: birthers, anti-vaxxers, chemtrails, intelligent design, faked moon landing conspiracists etc.
Yes, and then take some news from networks founded specifically to promote Democratic ideals and viewpoints and draw my own conclusions.
Something far more meaningful than people I know who simply dismiss Fox News because it's right-leaning. I'm actually surprised how many stories end up on Fox News that are never reported in the MSM.
If you're going to be myopic in how you get your news, why should I be surprised your views are just as narrow as the sources you get them from??
That would apply if there were honest, inevitable biases, and the journalists and editors were striving for integrity. However, in a situation where they are at best not very good, and at worst they are trying to manipulate the public opinion en masse, how can you use that kind of data? Isn't it garbage in, garbage out, no matter what your process of synthesis may be?
In the good old days of the Soviet Union, we could "read between the lines", because the censorship system was not very effective, and the journalists were pissed off. But we still failed horribly at understanding what the world looked like west of the border, and imagined capitalism was some lesser form of paradise, and were quite surprised when we had to start living it.
Personally, I am wary of reading people who I know are trying to manipulate me. I always suspect that if they try hard enough and long enough, they may succeed. What is your method, and does it work?
There is a difference, in that Fox News isn't the only global news network, whereas the Facebook platform, which delivers news among other things and through which a good percentage of people receive their news, is in essence the only global platform.
So, in some ways, they are a news-feed monopoly, because they have little competition in a space that, for a significant proportion of the population, is their main news source.
On the other hand, I can see the attraction to curate/censor the news.
Let's say there is an event in some volatile country, and there is news which, if widely exposed, would essentially pour fuel on a fire; the event that would stoke the fire is not incidental, but you know it could be used as an excuse for violence and result in numerous deaths. I can see the desire to keep people alive and protect property by moderating the news, at least temporarily, till things cool down.
I never imagined there would be any regulation of cable network content, but apparently the FCC has come up with a host [1]. The equal opportunities rule [47 CFR 76.205] and what is left of the fairness doctrine seem like they would have been violated, were Facebook regulated as cable.
I'm actually serious when I say this is a huge opportunity for Twitter. Twitter has the possibility for so much randomness, which I would say is good for the health of the world. It's a public free-for-all. Facebook, as Stratechery [1] and others have said, facilitates a groupthink world. You interact with your friends, who all think like you.
I'm an investor in Facebook and no longer one in Twitter, but I can't think of anything I'd like more than to lose my money on Facebook.
Twitter is already more overt than Facebook about its left-leaning censorship. Cf. the Trust & Safety Council, shadowbanning tweets and accounts, de-verifying accounts, and manipulating trending topics in a similar way to Facebook
I don't know about how Twitter selects "trending topics". But I find it strange that a "Trust & Safety Council" and other tools to help protect people could be considered "left-leaning". I'm pretty allergic to politics, especially politics being injected where it doesn't belong, but I'm just not seeing the connection there.
It's the other way 'round. Someone has a pragmatic stance on something, and the SJWs label it as political, and ban it. They also ban or not ban people because of their politics. It's one thing to ban all politics, another banning just those you oppose (and bonus if it's just an excuse and the real reason is money or connections).
You don't have to look further than Twitter for examples of all of the above.
If you want people protected from harassment and bullying, perhaps it follows that you would enlist the help of some of the more valiant SJWs. After all, who should know more about bullying than bullies themselves?
Facebook/Instagram and Twitter are so botted out by people paying for likes/comments/retweets, PR agencies, market manipulators, sentiment analyzers and state-sponsored disinformation campaigns that I would say these are not the democratic free for alls they once were. Not to mention gone are the days of strictly chronological feeds - now the algorithm that chooses what users see is a pay-to-win blackbox.
There will be a day of reckoning for the social media giants where they face telling their shareholders that XX% of their users... don't exist. And then soon there will be a whole tech-point-o iteration of new social media startups who focus on ensuring their user base is real, and that their feed algorithms can't be gamed by money or teams of people working artificially (with bots or AI) to influence opinion.
[0] http://imgur.com/UNRBrRu
[1] http://imgur.com/1f13R80