Some people talk about artificial intelligence as a ‘black box’. At Vubble, there is no such thing. Here’s Mariah Martin Shein, Vubble’s Director of Machine Learning, with an inside view.
- Regulation: Americans are not keen to regulate their platforms. It turns out the First Amendment (the one about freedom of speech) is really robust, and the net neutrality mentality has many of the people studying misinformation saying that not much can be done to regulate the platforms. Department of Justice Deputy Assistant Attorney General Adam S. Hickey explained at MisinfoCon DC, “Transparency, not prohibition, has been the government’s response to misinformation.” The DOJ will pass information to the platforms, but it won’t regulate them. Check out his full remarks. Outside the U.S., Germany is leading the charge in regulating the platform press, and Facebook Germany has hired scores of real people to review content, particularly for hate speech. This did lead to a high-profile piece of content being censored, and that has free speech advocates nervous. On an interesting note, Dr. Haroon Ullah, Chief Strategy Officer of the Broadcasting Board of Governors, urges would-be platform regulators to think beyond national borders. He wants us to consider languages, not countries, when creating solutions for misinformation (think misinformation in Russian, not in Russia).
- Facebook’s data: Researchers at MisinfoCon who study misinformation really want Facebook to release its data. Other platforms (e.g., Twitter) have done so. Researchers want to dig around and see what they can find in Facebook’s treasure trove, including how many people were really exposed to the Internet Research Agency’s US election content. The general feeling is that Facebook thinks PR first and public service second. And there are some brave tenured professors who are willing to do whatever it takes to get the data they need.
- Cool tools: Researchers at the Observatory on Social Media have developed some really cool tools to help people identify misinformation campaigns online. Botometer is an awesome tool for figuring out whether a Twitter account is actually a bot. Hoaxy, from the same team, reconstructs the diffusion networks that allowed a lie to spread. Here’s a quick video overview of these tools.
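To give a flavour of what “reconstructing a diffusion network” involves, here is a minimal sketch in Python. This is not Hoaxy’s actual code or API; the reshare records, account names, and helper functions are all invented for illustration. The idea is simply: given records of who reshared a claim from whom, you can build the spread graph and trace any account back to the claim’s origin.

```python
from collections import defaultdict

def build_diffusion_network(shares):
    """Build a diffusion network from (sharer, source) reshare records.

    Each record says `sharer` reposted the claim from `source`. Returns
    a children map (source -> accounts who reshared from it) and a
    parent map (sharer -> where they got it). Accounts that never
    appear as a sharer are treated as origins of the claim.
    """
    children = defaultdict(list)
    parent = {}
    for sharer, source in shares:
        children[source].append(sharer)
        parent[sharer] = source
    return children, parent

def trace_to_origin(account, parent):
    """Walk the diffusion chain from an account back to the earliest known source."""
    path = [account]
    while path[-1] in parent:
        path.append(parent[path[-1]])
    return path

# Hypothetical reshare events: (who shared it, who they got it from).
shares = [
    ("@bot_amplifier", "@origin_account"),
    ("@regular_user", "@bot_amplifier"),
    ("@another_user", "@bot_amplifier"),
]

children, parent = build_diffusion_network(shares)
print(trace_to_origin("@regular_user", parent))
# -> ['@regular_user', '@bot_amplifier', '@origin_account']
```

In a real investigation the reshare records would come from platform data, and the interesting signal is in the shape of the graph: a few high-degree amplifier accounts pushing a claim to many downstream users is exactly the pattern these tools help surface.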
- Media literacy: There’s general agreement that media literacy is needed, but little agreement over how to deliver a media literacy campaign or even who should run it. There are lots of cool small-scale experiments going on, including KQED Education in California, which is focused on elevating teen voices through hands-on media production. I like the work Jevin West is doing at the DataLab at the University of Washington Information School. He’s trying to help his students be critical, not cynical, about the scientific information, particularly the numbers, they consume. (You can also see Vubble’s credibility meter, our media literacy tool, on videos that appear on our website.)
- Provenance (noun): the place of origin or earliest known history of something. It’s a beautiful word, and an essential one for understanding misinformation campaigns. Knowing where information began is key to assessing its credibility. And while we’re defining words: misinformation is false or inaccurate information; disinformation is false or inaccurate information that is spread deliberately. The difference is the intention of the spreader. (If you want to know more definitions and frameworks for describing our collective Disinformation Disorder, this report is a brilliant place to start.) One final word on definitions: “fake news” is a useless bipartisan hammer. Let’s ditch that phrase altogether.