An independent body, whose creation was mandated by European regulation, will settle disputes. It goes live this Thursday, November 14.
Internet users can now challenge moderation decisions by TikTok, YouTube and Facebook. As of November 14, a European “court of appeal”, the Appeals Centre Europe, is responsible for hearing their requests. Its creation was mandated by the Digital Services Act, which came into force in August 2023. Disinformation, harassment, deepfakes, online hate, minors’ rights: the scope of action of this private organization, which claims to be “independent and impartial”, is wide-ranging. “This is an opportunity for users to regain power over the content they post, and to exercise a form of control over the social network ecosystem,” says Thomas Hughes, head of the Dublin-based Appeals Centre Europe (ACE). “It’s a silent revolution.”
While the scope of this court of appeal is currently limited to three platforms, it should be extended as of next year. The same applies to languages: initially, the ACE will handle disputes involving content in French, English, Spanish, Italian, German and Dutch. “Only a pan-European approach makes sense today,” Thomas Hughes insists, “to avoid a balkanization of the Internet.”
Five euros up front
In practical terms, a user who disagrees with a platform’s decision to remove a piece of content, or to leave it online, can appeal. To deter abusive procedures, they must pay 5 euros up front, which will be reimbursed if they win their case.
Users must first make their request to the platform, then fill in a form on the Appeals Centre website (www.appealscentre.eu). The DSA requires the platform to send the Centre a statement justifying its decision. “We will then evaluate the content and communicate our decision to the user, ideally within a few days,” says Thomas Hughes. “It will be in the platform’s interest to apply our decision, even if the law does not oblige it to do so. If it doesn’t, it will have to justify itself to the regulator.”
The data, which will be anonymized, will be accessible to European regulators, including Arcom, which have been regularly consulted on the creation of the Centre. Beyond the multitude of individual cases it will have to deal with, the Appeals Centre’s mission will be broader. “It will provide us with the first overview of the moderation landscape in Europe,” says Thomas Hughes. “We’ll have a picture of how these platforms work, the type of discourse they allow, the standards that need to be put in place…”
Sensitive subject
That includes the possibility of taking action in the event of “systemic risks”. “In certain cases, if it is proven that certain algorithms are creating dangerous situations, platforms may have to change them,” says Thomas Hughes. The European Commission is likely to keep a close eye on the Centre’s work. “The strength of this mechanism is that it is quick and inexpensive, whereas conventional legal proceedings can take years. But there is no question of replacing the courts,” warns Thomas Hughes.
The DSA requires platforms to inform Internet users of the existence of this appeal mechanism. The Centre’s president expects them not to be very proactive at first, while they get used to the exercise.
Moderation, now mostly carried out by machines on the platforms, has become a sensitive issue. Thousands of decisions are contested by Internet users every year. Some social networks, such as X (formerly Twitter), which have cut back their teams, are accused of being among the poor performers.
Platform financing
Meta’s Oversight Board receives an average of 200,000 appeals from Europe every year. Internet users have, for example, challenged Facebook’s removal of photos promoting the fight against breast cancer: images of breasts were taken down on the grounds that they constituted female nudity, in breach of Facebook’s rules. “This is a typical example of photos that should remain on the platform,” comments Thomas Hughes. “Our mission will include identifying errors in the implementation of platform policies.”
To spot these errors, which are sometimes due to cultural or linguistic differences, the Appeals Centre has recruited linguists, data scientists, and specialists in ethics and human rights, among others. In all, 25 people from various European countries will be hired by the end of the year, a number set to increase as the list of platforms covered expands.
The ACE will be financed by the platforms. A $15 million grant has been provided by Meta via its Oversight Board, and for each dispute handled, the platform concerned will have to pay the ACE $95. A new chapter is opening in the regulation of social networks.