"Political parties shall participate in the formation of the political will of the people," states Article 21 of the German Basic Law. Facebook is not covered by the passage, regardless of it and other internet platforms’ potential roles in the 2017 German election. Experts speak of filter bubbles and echo chambers these media create that reinforce users’ beliefs, rather than promote democratic discourse.
Filter bubble burst?
The term was coined by the American author Eli Pariser in 2011 to describe personalized web searches in which a website algorithm selectively guesses what information a user would like to see based on what it knows about that user. Communication researchers lack consensus on whether such a bubble actually exists, as different studies have come to different conclusions.
The German platform AlgorithmWatch released a report about the election absolving Google of the charge that it locks users into their respective political bubbles. More than 4,000 volunteers submitted predetermined search requests about German parties and politicians. In most of the nearly 6 million test cases, the first eight to ten results were the same for all users. This included those who searched while logged in to their Google accounts, about whom Google would have had the most personal information.
"The entire filter bubble theory is shaky at best," said Matthias Spielkamp from AlgorithmWatch. "It assumes that these people don’t get any other information" from any other source.
More an echo chamber than mendacious media
It may be a different story for those who distrust the mass media, as is the case for many potential voters of the Alternative for Germany (AfD). The trend echoes supporters of the Tea Party movement in the United States, who get their information from "alternative" sources. Out of this has emerged a culture of sharing on Facebook, one that promotes and distributes a user's particular worldview regardless of whether it is true. And when this independent "news" doesn't match what's on TV that night, it only reinforces the perception of a press that isn't telling the truth.
While Google may be able to proclaim its innocence, Martin Giesler told DW, Facebook is a different matter. The founder of Social Media Watchblog said that unlike Google, which sends users to other websites based on valid search results, "Facebook does everything to make users feel comfortable on the platform and this means that users are not confronted with things that challenge them, but rather with what confirms their worldview."
The longer a user is on Facebook, the more ads Facebook can present the user. And that is how Facebook earns money.
Giesler says this unintentionally leads Facebook down the same path as AfD, which is more about fear and emotion than hard facts. "That Facebook is a platform extremely well suited to packaging and presenting emotional content is beyond question," he said. "There's no reason at all for Facebook to remain sober or factual, because anybody can be boring."
Technology platform with social responsibility
Facebook has always offered the same defense against the charge of echo chambers, and against complaints such as extremist content during the US presidential election: it is merely a technology platform; the actual content comes from users. The company's position has begun to change, Giesler said, but he wants to see more transparency about how content is distributed on the platform and where advertising comes from.
Some content on Facebook does not come from ordinary users at all. The company reported uncovering about 3,000 ads from the US election campaign with probable links to Russia, whose origin could not be immediately identified. In response, Facebook founder and CEO Mark Zuckerberg released a nine-point plan to counter election manipulation. Ahead of the German election, Facebook said it had deleted "thousands" of accounts that had allegedly spread false information.
Media researchers may debate for years the extent to which social media played a role in the 2017 German election. A comprehensive investigation would require users voluntarily handing over a large amount of personal data.
"You’ve got to ask: How far do you go into users’ privacy?" Matthias Spielkamp said. "Without getting their private data, you can’t get reliable results."