The European Commission has given Meta (META.O) and TikTok a week to outline the steps they are taking to stop the spread of terrorist content, violence, and hate speech on their platforms, one week after Elon Musk’s X received the same instruction.
The European Union’s executive body said on Thursday that it had requested the information from the two companies after academics noted a surge of misinformation following Hamas’ attack on Israel more than a week ago. If the Commission is unsatisfied with the companies’ replies, it may open formal investigations into them.
Under the EU’s new rules, known as the Digital Services Act (DSA), major online platforms must do more to remove dangerous and unlawful content or face fines of up to 6% of their global turnover.
“Meta must provide the requested information to the Commission by 25 October 2023 for questions related to the crisis response and by 8 November 2023 on the protection of the integrity of elections,” stated the Commission.
“TikTok must provide the requested information to the Commission by 25 October 2023 for questions related to the crisis response and by 8 November 2023 on the protection of integrity of elections and minors online,” it stated.
The Commission’s requests to Meta and TikTok mark an early test of the DSA’s powers to tackle disinformation and illegal content. The companies’ responses, and any investigations that follow, will signal how forcefully the EU intends to enforce its new rules for a safer and more accountable online environment.