The 2016 presidential election results have wide-ranging implications in a number of fields. In the tech world, we’re already starting to see the impact as Google and Facebook take steps to combat fake news sites (which some people blame for spreading misinformation during the lead-up to the election) and Twitter rolls out new tools to help users fight back against harassment.


Some of these changes were likely under development before the November 8th election. But their announcement takes on new significance following an election in which race, gender, and religion all played important roles, and in the same week the FBI reported that hate crimes were up in 2015, including a 67 percent increase in hate crimes against Muslims.

Twitter is rolling out several updates that give users the ability to mute keywords, phrases, and conversations in notifications, effectively keeping abusive users from spamming their feeds.

The company is also making it easier for users to report violations of its “hateful conduct policy” by adding an option to flag a reported tweet that “directs hate against a race, religion, gender, or orientation.”

Last night Google revealed that it’s updating its ad network policy: websites that “misrepresent, misstate, or conceal information about the publisher, the publisher’s content, or the primary purpose” of the site will be barred from displaying Google’s ads, taking away a major source of revenue and a potential incentive for publishing fake news stories in the first place.

Shortly after that announcement, Facebook made a similar move, barring sites that traffic in fake news from using its ad network.

That decision is particularly interesting given Facebook CEO Mark Zuckerberg’s repeated assertions that fake news makes up a small percentage of what’s shown to users of the social network and that it likely had no impact on the outcome of the presidential election, a view that’s not universally shared and not particularly easy to back up if you don’t happen to work at Facebook.

Of course, it’s impossible to say at this point whether any one thing could have changed the results of the election. But hopefully these are some of the first steps toward a world where we’re less likely to encounter publications that game Facebook’s (or Google’s) algorithms to inundate us with articles and headlines about things that literally did not happen in order to influence elections one way or another (it happens on the left and the right, although a recent BuzzFeed report suggests more so on the right).

At the same time, there’s a risk that attempts to block fake news could result in false positives, making it more difficult for fact-based reporting to be disseminated. As Boing Boing’s Cory Doctorow notes, maybe the solution is to rely less on machine learning and algorithms and employ more real human beings who can separate the wheat from the chaff.
