Increasingly, what we see online is chosen for us by algorithms that predict what we want to see. That may or may not be a good thing. Journalists and editors should treat algorithms as yet another beat to cover.
Aside from a deep love of (hopefully) cheesy horror movies like Oculus, I happened to be in Times Square in New York City the other day, near the Port Authority Terminal, where I saw a billboard with the movie’s perfect tagline: You See What It Wants You to See.
And, no, this is not Oculus, the company Facebook paid billions to buy.
The movie tagline is perfect for this essay because it captures the spirit of an article I read earlier this month about algorithms, how they are reported on in the media, and what could be done better. The article reports on a paper by Nick Diakopoulos, Algorithmic Accountability Reporting: On the Investigation of Black Boxes, where he writes:
We're living in a world now where algorithms adjudicate more and more consequential decisions in our lives. It's not just search engines either; it's everything from online review systems to educational evaluations, the operation of markets to how political campaigns are run, and even how social services like welfare and public safety are managed. Algorithms, driven by vast troves of data, are the new power brokers in society.
Without realizing it, by moving much of our lives online, we’ve become subject to a silent and mostly obscure set of rules we often do not control. These rules hide content from us. And, in some cases, the rules limit how deeply we can explore a subject because we never see the options.
The most obvious example is the Facebook News Feed, for people who use the service. Because I have limited time to participate, I almost always use the news feed to catch up. Yet at least one person has told me that’s silly: people post things on their home pages that might never appear in my news feed. I don’t get a complete view unless I click through to the dozens of home pages of family and friends.
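To make the idea concrete, here is a toy sketch of how a feed filter might work: score every post, show only the top few, and silently drop the rest. All the names, weights, and the scoring rule below are invented for illustration; real feed algorithms are proprietary and far more complex.

```python
# Toy feed filter: score each post and show only the top `limit` posts.
# The weights and the scoring rule are invented for illustration only.

def score(post):
    # Weight likes and comments up, and age (hours since posting) down.
    return post["likes"] * 1.0 + post["comments"] * 2.0 - post["age_hours"] * 0.5

def build_feed(posts, limit=3):
    # Sort by score, descending; everything past `limit` is never seen.
    ranked = sorted(posts, key=score, reverse=True)
    return ranked[:limit]

posts = [
    {"author": "Aunt May", "likes": 2,  "comments": 0,  "age_hours": 1},
    {"author": "Friend A", "likes": 50, "comments": 10, "age_hours": 3},
    {"author": "Friend B", "likes": 5,  "comments": 1,  "age_hours": 20},
    {"author": "Cousin",   "likes": 30, "comments": 4,  "age_hours": 2},
    {"author": "Friend C", "likes": 1,  "comments": 0,  "age_hours": 48},
]

feed = build_feed(posts)
hidden = [p["author"] for p in posts if p not in feed]
# Friend B and Friend C score low, so they are hidden from the feed.
```

The point of the sketch is the last line: the posts that score poorly simply vanish, and the reader never knows they existed.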
The article about the need for the media to cover an algorithms beat, Interviewing the algorithm: How reporting and reverse engineering could build a beat to understand the code that influences us, describes how little reporting is done on these algorithms and rules. Most or all are controlled by private companies with little or no oversight. While most people trust other people, trust can be abused. And, as the article points out, limiting content creates problems for free will, and even for the idea of democracy itself.
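The "reverse engineering" the article describes often amounts to black-box probing: feed an opaque system controlled variations of the same input and watch how the output changes. Here is a minimal sketch of that move, where `black_box` is a hypothetical stand-in for any opaque system (a search ranker, a price-quote page); its hidden rule is invented for the example.

```python
# Sketch of "interviewing the algorithm" by black-box probing.
# black_box is a hypothetical opaque system with an invented hidden rule.

def black_box(query, user_location):
    # Pretend hidden rule: the quoted price is higher downtown.
    base_price = 100
    return base_price + (20 if user_location == "downtown" else 0)

def probe(locations):
    # Hold the query constant, vary one input at a time, record outputs.
    # This controlled variation is the core move of algorithmic
    # accountability reporting.
    return {loc: black_box("sweater", loc) for loc in locations}

results = probe(["downtown", "suburb", "rural"])
# Differences across the outputs reveal the hidden location rule.
```

A reporter doing this for real would repeat the probes many times, from many accounts and locations, to separate a genuine rule from noise.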
Equally interesting, the article points out that the more complicated algorithms are sometimes not understood even by their creators. These algorithms create side effects that may be obvious, for example, crashes caused by high frequency trading on stock markets, or invisible: the person searching for the perfect cashmere sweater never sees the sweater they want to buy. Or the voter who wants to fact check a negative claim about a politician never sees, or has to work hard to find, the larger context needed to decide how to vote.
Some of this effect from algorithms simply mirrors one aspect of human life: our choices today limit or expand our choices tomorrow. However, with algorithms the choices are created by other human beings like us and used by large often private companies with little or no oversight. Plus, even if we make stupid choices and starve, presumably we have a basic human right to food as a member of society. The alternative is the law of the jungle.
Who controls these rules in the media is striking. In the US, in 1983, fifty media companies controlled 90% or more of what Americans saw, heard, and read in the media. By 2010, six companies controlled that same share. Supposedly 232 media executives control what 227 million Americans experience in the media.
While media consolidation, like the equally severe consolidation of banks, airlines, and many other industries, has happened mostly under the radar for decades, the algorithms used to present content matter even more. People cannot make informed decisions if 232 media executives decide to ignore an issue. In theory, people should be free to pick and choose from the entire stream of possible content. And people should understand how their access is limited by algorithms, as well as how to get around any restrictions.
Journalists should create a beat to cover algorithms, and there should be more social awareness of, and control over, how all of us interact with algorithms. We should see what we want to see, regardless of our political, social, economic, or other interests. The world and its people are extremely diverse. Technology should reflect this diversity.
And no worries, I’ve also linked below to the Oculus movie site on YouTube for those who want to check it out. The Times Square ad I saw is part of the promotion ahead of the film’s April 11, 2014 opening here in the US. My preference is to see horror films at 10 a.m. with popcorn for breakfast on a sunny day, basically the opposite of the often dark and dreary world of horror movies.
Interviewing the algorithm: How reporting and reverse engineering could build a beat to understand the code that influences us
Algorithmic Accountability Reporting: On the Investigation of Black Boxes
Automating Layouts Bring Flipboard's Magazine Style To Web And Windows
An excellent description of an algorithm used to present content in the Flipboard app.
These 6 Corporations Control 90% Of The Media In America
The Business Insider article has the infographic. The FastCoDesign.com piece is opinion but the comments are diverse and often thoughtful.
Oculus (the movie)
In case you want to see the trailers. Movie is out April 11, 2014 in the US.