Attention is a valuable commodity on the Internet. Platforms, especially social networks, want to keep their users for as long as possible, and to do so they resort to a range of technical tricks. Whether it’s TikTok, Twitter or Instagram: with every visit, there are new posts to discover and rate.
The algorithm is responsible for this — in practice, a set of instructions telling computers how to handle user preferences. However, “likes” in the form of a thumbs-up button are by no means the only factor behind the recommendations, as algorithm expert Julia Neidhardt of Vienna University of Technology explained in an interview with ORF.at.
Systems that can do it all
These recommendation algorithms have evolved greatly in recent years. Modern systems can — frankly — do everything, according to the computer scientist, and they base their decisions on a plethora of factors. According to Neidhardt, clicks, friends, location, technical details such as phone model and operating system, and even the current weather play a role, along with countless other variables.
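To illustrate the idea of many signals feeding one decision, here is a minimal sketch of a feed-ranking function. All signal names and weights are invented for this example; real systems learn such weights from data and combine vastly more factors.

```python
# Toy ranking sketch: each candidate post carries a set of active signals
# (strings), and a hand-set weight table turns them into a relevance score.
# Signals and weights below are hypothetical, chosen only to mirror the
# factors mentioned in the text (clicks, friends, location, device, weather).

WEIGHTS = {
    "clicked_similar": 2.0,   # user has clicked similar content before
    "friend_engaged":  1.5,   # a friend liked or shared the post
    "near_location":   0.8,   # post relates to the user's region
    "mobile_device":   0.3,   # user is browsing on a phone
    "bad_weather":     0.2,   # indoor browsing time tends to rise
}

def score_post(signals):
    """Sum the weights of all signals active for this post/user pair."""
    return sum(WEIGHTS.get(s, 0.0) for s in signals)

def rank_feed(candidates):
    """candidates: list of (post_id, signals). Returns post ids, best first."""
    ranked = sorted(candidates, key=lambda c: score_post(c[1]), reverse=True)
    return [post_id for post_id, _ in ranked]
```

Even in this tiny sketch it is easy to see why explanations get hard: the final order emerges from the interplay of many weighted signals, none of which alone “caused” a post to appear.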
Unlike 15 years ago, social networks are now an essential part of everyday life — whether in elections, in crises or in opinion formation — and therefore highly political. All the more skeptically do experts view the fact that power over these “black boxes,” into which hardly anyone can see, lies primarily in the hands of a few very large internet companies.
Big plans for the European Union
The European Union wants to change that: the Digital Services Act, which is meant to rein in the power of the large network companies, also contains rules for algorithms. An explanatory paragraph of the draft, which cleared another major hurdle at the end of April, states that very large online platforms must ensure that users are “appropriately informed.” Moreover, “the most important parameters of recommender systems should be presented in an easily understandable manner” so that it is clear how information is prioritized for them.
Technical details were left out, which is not unusual in such proposals at the EU level. The response from data protection advocates has been mostly positive. But TU expert Neidhardt also sees a limitation: “What there will not be is an exact explanation of why exactly this item was shown at this moment.”
Speaking to ORF.at, both Neidhardt and John Albert of the NGO AlgorithmWatch point to the similar explanations Facebook already provides for ads: there, it is possible to see why a particular ad was shown, with criteria such as “a person, female, living in Austria, interested in politics.” For regular posts, this function does not exist — but the EU initiative at least wants to bring more transparency into recommendations.
Even the developers themselves are often at a loss
This also poses a big problem for the companies themselves, however: often it is difficult or impossible even for the developers to understand the algorithm’s decisions. Among other things, Albert points to a Twitter study from last year, in which company employees examined whether the algorithm amplifies political content on Twitter — with the result that it does.
This is mainly due to rapid advances in artificial intelligence (AI), according to Neidhardt. Algorithms thereby develop a certain degree of autonomy — first they are trained, then the machine makes decisions on its own — and the more factors involved, the murkier the basis for any given decision.
Due to developments in recent years, however, these factors have multiplied — not least because the systems keep getting “smarter.” A few years ago, all data had to be painstakingly prepared before recommendations could be issued on its basis; modern systems, which rely among other things on “deep learning,” can process and weigh large amounts of content directly, according to Neidhardt. Extracting the sentiment from a text, for example — in effect “reading” it — is no longer a big problem, says the expert.
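The contrast with the “hand-prepared data” era can be sketched with a toy lexicon-based sentiment scorer — the kind of manually built resource that deep-learning models have largely replaced by learning word associations from data. The word list and scores below are invented for this illustration.

```python
# Toy lexicon-based sentiment scorer: the "old" approach in which all the
# data (here, a word list with hand-assigned scores) had to be prepared
# in advance. Modern deep-learning systems learn such associations from
# raw text instead. The lexicon below is purely illustrative.

LEXICON = {
    "great": 1, "love": 1, "excellent": 1,
    "terrible": -1, "hate": -1, "boring": -1,
}

def sentiment(text):
    """Classify a snippet as 'positive', 'negative' or 'neutral'."""
    words = (w.strip(".,!?") for w in text.lower().split())
    total = sum(LEXICON.get(w, 0) for w in words)
    if total > 0:
        return "positive"
    if total < 0:
        return "negative"
    return "neutral"
```

A hand-built lexicon like this misses irony, context and any word not on the list — precisely the gaps that make learned models more capable, and their decisions harder to trace.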
Imbalance between companies and the public
Albert, meanwhile, notes that companies make conscious decisions when designing their algorithms — and those decisions have “measurable impacts” on how we deal with information, and not least with one another. According to Albert, there is an “asymmetry of power and information” between platforms and the public when it comes to how the algorithms work.
Among other things, he points to the “Facebook Files,” which showed that Facebook was well aware that a modified system recommended more misinformation and violent content. The company’s management nevertheless opposed changes to the algorithm — not least because of increased user engagement.
Companies are still blocking investigations
Until now, such investigations have usually had to be conducted without the help of the large companies — which permits only a superficial look at the ranking mechanisms of Facebook and Co., if any at all. Because such research often allegedly violates the platforms’ terms of use: AlgorithmWatch’s Instagram research project, which relied on data donations from users, was discontinued. A similar project by researchers from New York ended with Facebook banning the researchers’ accounts.
This, too, is set to change with the Digital Services Act: independent audits of the algorithms are planned, as is detailed information about their criteria. “Platforms will have to examine the actual and potential risks that their services pose to the public,” Albert said.
For many data protection advocates, however, analyzing the existing algorithms does not go far enough. Alternative algorithms for the network giants are also being discussed — most recently at the CPDP data protection conference in Brussels at the end of May.
Experts call for alternatives to platforms’ in-house algorithms
In an expert discussion on the topic, Katarzyna Szymielewicz of the Polish NGO Panoptykon criticized the current situation: “We cannot choose another interface or another algorithm.” Instead, the large platforms observe our online behavior and then analyze this feedback.
By “understanding who we are,” they gain “analytical power.” Knowing our sore spots and tendencies only strengthens their position and prevents competitors from offering similarly attractive services. Data protection expert Raegan MacDonald, who previously worked for Firefox maker Mozilla, sees it similarly: “Those with the most power also benefit the most from this,” says the expert.
Expert: New Googles and Amazons won’t solve the problem
Szymielewicz points out that it is not just about “dominance” but about the “mistreatment of people.” More companies built on the same business model as Meta, Google and Amazon will not solve the underlying problem. Instead of getting everything from a single source, users should be able to choose who provides the algorithm or the interface.
An alternative could then be chosen in place of Facebook’s recommendations; according to Szymielewicz, it is conceivable, for instance, that an NGO would provide such an algorithm. Suddenly, no commercial interest would stand behind the algorithm, and the big companies’ hunger for data — which in turn fuels their growth — could be circumvented, at least in theory. For the large companies, however, this would likely amount to a breakup — and whether such rules can ever be enforced is doubtful.
Digital Services Act as a first step in a larger mission
As a first step, the EU Commission’s ambitious plan aims to give internet users more transparency than before. But to shift some power back from the internet giants to the users, merely explaining how recommendations come about will not be enough.
It would take nothing short of a complete and permanent opening of the “black box” — until now a well-kept trade secret — so that it could be checked at any time whether the algorithm is still doing what it was originally designed to do. Before Brussels finally launches its Digital Services Act, plenty of resistance from Silicon Valley can therefore be expected.