Regulating Facebook's news feed algorithm

On 18 September 2021, the Wall Street Journal published the Facebook Files.

According to the investigation, Facebook Inc. appears "to know that its platforms are riddled with flaws that cause harm, often in ways only the company fully understands". This central finding of the Wall Street Journal series is based on a review of internal Facebook documents, including research reports, online employee discussions, and drafts of presentations to senior management.

It also appears that Facebook's own researchers identified the platform's ill effects and that, despite congressional hearings and media coverage, the company did not fix them. These are problems allegedly known to Facebook's leadership.

According to the whistleblower, Frances Haugen, Facebook's algorithm lies at the root of these concerns.

Ms Haugen stated that corporate decisions, product design, and incentive structures often prioritize profit and growth over public safety and democratic responsibility.

Moreover, some findings show that Instagram's algorithm exploits teen girls' insecurities by showing them posts related to extreme dieting and even self-harm.

The whistleblower highlighted that this problem, together with others such as the algorithm's promotion of "meaningful social interactions" between users, was downplayed by Facebook.

The former Facebook employee argues that governments should focus on ensuring greater accountability and transparency from the companies that shape public discourse.

One way to do so is to regulate Facebook's algorithm, along with those of other social networks. It remains unclear, however, whether this is best achieved by regulating the algorithms themselves or by making companies more responsible for their effects.

The Washington Post reports that, in the past year, at least five bills have been introduced or reintroduced in Congress that focus explicitly on the software programs deciding what people see on social media platforms. Such bills may, however, conflict with the US First Amendment, which constrains the government's power to regulate companies' speech policies.

The same article highlights that Ms Haugen endorsed a bipartisan bill called the Filter Bubble Transparency Act.

The bill would require platforms such as Facebook to explain their algorithms to consumers more clearly and to offer everyone the option of a feed that is not manipulated by ranking software.
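To make the distinction concrete, the following minimal Python sketch contrasts an engagement-ranked feed with an unranked, purely chronological one. All names, fields, and scores here are hypothetical illustrations, not Facebook's actual ranking system:

```python
from dataclasses import dataclass

@dataclass
class Post:
    author: str
    timestamp: int               # seconds since epoch (illustrative)
    predicted_engagement: float  # hypothetical platform score

def ranked_feed(posts):
    # Engagement-ranked feed: ordering decided by opaque ranking software.
    return sorted(posts, key=lambda p: p.predicted_engagement, reverse=True)

def chronological_feed(posts):
    # The unmanipulated alternative: newest first, no ranking model involved.
    return sorted(posts, key=lambda p: p.timestamp, reverse=True)

posts = [
    Post("alice", 100, 0.2),
    Post("bob",   200, 0.9),
    Post("carol", 300, 0.5),
]

print([p.author for p in ranked_feed(posts)])         # ['bob', 'carol', 'alice']
print([p.author for p in chronological_feed(posts)])  # ['carol', 'bob', 'alice']
```

The two functions consume identical inputs but produce different orderings; the bill's core idea is that users could switch the ranked version off.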

This bill could represent a workable model for any state seeking to regulate social networks' algorithms, as well as a relatively easy regulation to implement.

Luca Megale is a PhD student at LUMSA University of Rome and a tutor of the European Master in Law and Economics - EMLE (Rome term).