Understanding the Digital Services Act (DSA) and Its Implications for Online Platforms

Published on October 18, 2024

The Digital Services Act (DSA) is a landmark EU regulation aimed at making the digital space safer, more transparent, and more accountable. Applying to online platforms that operate in the EU, the DSA introduces obligations such as enhanced content moderation, reporting mechanisms for illegal content, and restrictions on targeted advertising, particularly for minors. The framework touches platforms of all sizes, from small hosting services to very large platforms like Amazon and TikTok, and non-compliance can lead to severe penalties, including fines of up to 6% of annual global turnover.

A) What Does the DSA Mean for Your Business?


For online platforms, the DSA requires tools that give users accessible reporting mechanisms, plain-language terms of service, and freedom from targeted advertising if they are minors. Hosting services, marketplaces, and other intermediary services must also meet specific obligations, ranging from clear terms of service to external auditing for larger platforms. Platforms must clearly explain content moderation decisions and offer users the ability to appeal content removals.
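
To make these obligations concrete, here is a minimal sketch of the data a user report ("notice") and a platform's explanation ("statement of reasons") might carry. The interface and field names are illustrative assumptions, not a schema prescribed by the DSA:

```typescript
// Illustrative sketch only: names and fields are assumptions, not a DSA-mandated schema.

// A user-submitted notice flagging allegedly illegal content.
interface ContentNotice {
  contentUrl: string;   // precise location of the reported item
  explanation: string;  // why the reporter considers the content illegal
  reporterEmail?: string; // optional, so reports can remain anonymous
  submittedAt: Date;
}

// The explanation a platform owes the affected user when it acts on content.
interface StatementOfReasons {
  contentUrl: string;
  decision: "removed" | "restricted" | "demoted" | "no_action";
  basis: string;               // the law or terms-of-service clause relied on
  automatedMeansUsed: boolean; // whether the decision was made automatically
  appealInstructions: string;  // how the user can contest the decision
  issuedAt: Date;
}
```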

Key Responsibilities for Platforms:

  • User Rights: Users need transparent explanations for content recommendations and must have the option to refuse personalized suggestions.
  • Ad Restrictions: Sensitive personal data, such as ethnicity and political views, cannot be used for targeted advertising, and minors may not be profiled for ads at all (see the sketch after this list).
  • Content Moderation: Platforms are responsible for removing illegal content quickly while balancing users’ freedom of expression.
  • Transparency: Clear labeling of ads and explanations for content removals will be essential. Users must be informed why content is flagged or removed.
  • Risk Management: Platforms need to manage risks like disinformation and create appropriate mitigation measures.
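
As a sketch of how the advertising rules above might be enforced in application code (the names and categories here are hypothetical, and real compliance decisions need legal review):

```typescript
// Hypothetical pre-flight check for an ad request; illustrative, not a legal compliance tool.

// Categories of sensitive personal data that may not be used for ad targeting.
const SENSITIVE_CATEGORIES = new Set([
  "ethnicity", "political_views", "religion", "sexual_orientation", "health",
]);

interface AdRequest {
  userIsMinor: boolean;       // whether the platform knows the user is a minor
  targetingSignals: string[]; // profiling categories the ad would rely on
}

function targetingAllowed(req: AdRequest): boolean {
  // Minors may not be shown ads based on profiling at all.
  if (req.userIsMinor && req.targetingSignals.length > 0) return false;
  // No user may be targeted via sensitive categories of personal data.
  return !req.targetingSignals.some((signal) => SENSITIVE_CATEGORIES.has(signal));
}
```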

B) The Impact on Large Platforms (VLOPs)

Very Large Online Platforms (VLOPs), those with more than 45 million monthly active users in the EU, face additional responsibilities, including systemic risk management and data-sharing requirements with authorities. The European Commission's first designation round named 19 VLOPs and Very Large Online Search Engines (VLOSEs), among them Amazon, Google, and Facebook, which must comply with the most stringent obligations. For these platforms, transparency in algorithmic recommendations and independent auditing will be crucial to building user trust and avoiding legal penalties.
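
One of those recommendation obligations is offering at least one feed option that is not based on profiling. A minimal sketch of how a feed service might honor that choice (the types and ranking logic are assumptions for illustration):

```typescript
// Hypothetical feed selection honoring a non-profiling recommender option.

interface Post {
  id: string;
  publishedAt: Date;
  relevanceScore: number; // produced by a profiling-based ranker
}

function rankFeed(posts: Post[], personalizationEnabled: boolean): Post[] {
  if (!personalizationEnabled) {
    // Non-profiling option: plain reverse-chronological ordering.
    return [...posts].sort(
      (a, b) => b.publishedAt.getTime() - a.publishedAt.getTime(),
    );
  }
  // Personalized option: rank by the profiling-based relevance score.
  return [...posts].sort((a, b) => b.relevanceScore - a.relevanceScore);
}
```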

C) Challenges for Businesses

The DSA raises several compliance challenges. Platforms must evaluate their content moderation practices and invest in localization, since they will need to account for the legal and linguistic nuances of all 27 EU member states. Large platforms, in particular, face obligations around external auditing and crisis response mechanisms. Aligning with DSA requirements will take a blend of automated tools and human moderation to navigate these standards effectively.
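
As an illustration of that blend, here is a hypothetical triage step for a multilingual moderation queue: automated classification acts only above a confidence threshold, and everything else, including unsupported languages, goes to human reviewers. All names and thresholds are assumptions:

```typescript
// Hypothetical triage for a multilingual moderation queue.

interface ClassifierResult {
  label: "illegal" | "ok";
  confidence: number; // model confidence in [0, 1]
  language: string;   // detected language of the content
}

type Routing = "auto_remove" | "auto_keep" | "human_review";

const AUTO_ACTION_THRESHOLD = 0.95; // act automatically only when very confident
const SUPPORTED_LANGUAGES = new Set(["en", "de", "fr"]); // assumed model coverage

function route(result: ClassifierResult): Routing {
  // Languages the classifier was not trained on always go to human moderators.
  if (!SUPPORTED_LANGUAGES.has(result.language)) return "human_review";
  // Low-confidence calls are escalated rather than auto-actioned.
  if (result.confidence < AUTO_ACTION_THRESHOLD) return "human_review";
  return result.label === "illegal" ? "auto_remove" : "auto_keep";
}
```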

D) Preparing for Compliance

The DSA has applied in full since February 17, 2024, so any platform still out of step needs to close the gap now. For platforms that breach its rules, fines can reach as much as 6% of global turnover. Compliance will require adopting advanced content moderation tools that can meet the multilingual demands of EU regulation, manage risk, and flag illegal content efficiently. By putting these measures in place, companies can not only avoid penalties but also foster greater user trust.

E) Content Moderation: A Cornerstone of Compliance

Effective content moderation is essential for compliance with the DSA. Platforms must implement robust mechanisms to enable users to report illegal or harmful content and provide swift resolutions. Platforms like X (formerly Twitter), Pinterest, and Facebook have already started implementing enhanced moderation tools. Ensuring users can appeal content moderation decisions will also be vital to maintain fairness and transparency.
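
The appeal path can be sketched just as simply: a complaint reopens the case and is assigned to a reviewer who was not involved in the original decision. This is a hypothetical illustration of that fairness constraint, not any platform's actual workflow:

```typescript
// Hypothetical appeal assignment: the original decision-maker never reviews the appeal.

interface Appeal {
  contentUrl: string;
  originalReviewerId: string;
  userArgument: string; // the user's grounds for contesting the decision
}

function assignAppeal(appeal: Appeal, reviewerIds: string[]): string {
  const independent = reviewerIds.filter((id) => id !== appeal.originalReviewerId);
  if (independent.length === 0) {
    throw new Error("no independent reviewer available");
  }
  // Simplest possible policy: hand the case to the first independent reviewer.
  return independent[0];
}
```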

F) Turning Compliance into Opportunity

Rather than viewing the DSA as a regulatory burden, businesses should treat it as an opportunity to differentiate themselves in a crowded market by building user trust. With transparency and user empowerment at its core, the DSA presents a framework that prioritizes user protection, leading to a safer, more accountable digital environment.

G) The Road Ahead for Digital Platforms

With the DSA now fully applicable across the EU, it's crucial for online platforms to finish their compliance work. The new obligations, ranging from content moderation to ad transparency, are designed to protect users, enhance trust, and foster a safer digital economy. Businesses that embrace these changes will not only ensure compliance but also stand out as trustworthy digital service providers.