What Algorithms Can Learn from Journalism
Inside Flipboard / January 26, 2018
There’s no doubt that people, companies and even governments are weaponizing social media platforms by paying to game the system: the algorithms and intelligence used to do such benign things as recommend a story your friend posted or a book you may be interested in reading. These machine learning engines, in the simplest terms, look at how people behave around an item and then boost or share it to specific recipients based on their potential interest.
These systems were designed to give people interesting stories relating to family, friends, and the world around them. But bad actors quickly figured out how to use these systems to appeal to the underbelly of human nature and incite or inflame already divisive ideas. And so here we are today, in a post-election era of confusion around the news we get, with reasons to question the intentions behind every story.
In my talk at Web Summit, I suggested that it’s not the fault of the algorithms, dutifully doing their job, but that instead the parameters and guideposts need to be reset.
This is where journalism and editorial values have something to teach us all. Journalism has rules that remind us of our responsibility in storytelling and give us clear ways to preserve the integrity of how news is surfaced to the world. I would suggest that our approach to algorithms follows five journalistic principles:
1) Truth and Accuracy
Lies travel faster than the truth, and people start to believe them. There is no algorithm for truth. So how do you do it? Put editors in charge, people who can separate fact from fiction, and have them advise on what goes into the algorithms. Journalists can decide what’s quality, truthful and accurate. Continual monitoring and improvement help ensure the algorithms are not abused; watching the output over time is paramount to the process.
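One concrete way to put editors in the loop is to let editor-assigned trust scores for sources temper raw engagement signals in ranking, so a viral story from an untrusted source doesn’t automatically win. The scoring function, weights and source names below are hypothetical, a minimal sketch of the idea rather than any platform’s actual algorithm:

```python
# Hypothetical sketch: editorial trust scores temper raw engagement in ranking.
# The source names, scores, and weighting scheme here are illustrative only.

EDITOR_TRUST = {              # assigned and maintained by editorial staff
    "reuters.com": 1.0,
    "knownhoax.example": 0.1,
}
DEFAULT_TRUST = 0.5           # unreviewed sources get a neutral default

def score(story):
    """Blend engagement with editorial trust instead of using clicks alone."""
    trust = EDITOR_TRUST.get(story["source"], DEFAULT_TRUST)
    return story["engagement"] * trust

stories = [
    {"source": "knownhoax.example", "engagement": 900},  # viral but untrusted
    {"source": "reuters.com", "engagement": 400},
]
ranked = sorted(stories, key=score, reverse=True)
# The trusted source outranks the more "engaging" hoax.
```

Watching the output over time, as the paragraph above suggests, would mean logging `ranked` results and letting editors adjust the trust table when abuse slips through.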
2) Independence
Editors ensure algorithms are working in service to the audience, not the business model. Look at the algorithms surfacing clickbait stories and ads and you see how gamed content has become. This is a huge problem for the web, where content isn’t determined by what’s best for the viewer but by what drives the business model.
3) Fairness and Impartiality
Divisive and polarizing content can be favored by algorithms and drive opinions apart. As they say, there are two sides (at least) to every story, and journalists work to discover and listen to diverse voices. Editors working with engineering can tune algorithms to feature multiple points of view so that we have greater understanding and empathy in our world, not less. How can we be united and work together if we don’t understand why we disagree?
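Tuning for multiple points of view could be as simple as re-ranking a feed so no single viewpoint dominates the top slots. The viewpoint labels below are hypothetical (in practice they would come from editorial classification); this is a minimal round-robin sketch, not a production diversification algorithm:

```python
from collections import defaultdict, deque

def diversify(stories):
    """Interleave stories across viewpoints (round-robin) so the top of the
    feed surfaces more than one perspective, preserving order within each."""
    buckets = defaultdict(deque)
    for s in stories:                  # stories assumed pre-sorted by score
        buckets[s["viewpoint"]].append(s)
    result = []
    while any(buckets.values()):
        for view in list(buckets):     # alternate across viewpoint buckets
            if buckets[view]:
                result.append(buckets[view].popleft())
    return result

feed = [
    {"title": "A1", "viewpoint": "pro"},
    {"title": "A2", "viewpoint": "pro"},
    {"title": "A3", "viewpoint": "pro"},
    {"title": "B1", "viewpoint": "con"},
]
# The lone "con" story rises to the second slot instead of being buried last.
```

The design choice here is deliberate: rather than suppressing any viewpoint, the re-ranker only changes ordering, which is closer to the editorial goal of exposure to disagreement.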
4) Humanity
The press and social media can be very powerful and dangerous in people’s lives, and journalists understand this responsibility. The human impact is why stories must be carefully researched and backed up, so that the power of the medium is used not for harm but for exposing or celebrating the truth. Algorithms feel no responsibility for the lies they proliferate. Editors and engineers can partner to design algorithms that restrict the flow of harmful stories by watching and adjusting what’s recommended to people on the platform.
5) Accountability
No system is perfect. When errors are made, such as when false stories are trending or bad content is being surfaced to your audience, they should be corrected immediately and the algorithms adjusted. As with editorial teams at quality publications, mistakes are recognized and fixed publicly to preserve trust and integrity. Using algorithms does not excuse platforms from these same responsibilities.
Algorithms are a fact of the internet. But they cannot be left to their own devices. People created them and now people are responsible for how they act. By blending algorithmic approaches with journalistic sensibilities, we can begin to solve for the morass of fake news and horrible content that riddles the web today. We have devolved to a place where people no longer trust what’s shared with them or what they read on the internet.
This was not what was intended. The web was built with the hope that by sharing information with people around the world, we’d create greater access, equality and global advancements that we could previously only dream of. But today the power of algorithms, gamed by bad actors and even governments, has not just ruined the experience but also stripped away our trust. I believe we can massively minimize this cancer on the Internet if, as an industry, we jointly tackle these challenges and share the trends we see in our algorithms. We need to identify bad actors more quickly, minimize the spread of fake stories, and reclaim the quality people deserve to have on the Internet.
Algorithms have to be held accountable. This takes guts and a set of principles to live by.