
Meta’s VP and Head of Global Safety, Antigone Davis, revealed the tech giant’s latest efforts to protect minors online. In an exclusive interview, she discussed everything from countering child abuse online to age verification, while calling for harmonised EU digital rules.
Before the launch of the Meta Youth Safety event in Brussels this Tuesday, Antigone Davis, Meta’s VP and Head of Global Safety, talked to Euronews about her vision for online child protection laws.

An EU-harmonised age verification system by app stores

Although the European Union and the United States have differing perspectives on regulating the online space, “there are areas where we have some similarities,” said Davis, who is American. One of these shared priorities is the protection of minors. As a result, Meta is urging lawmakers on both sides of the Atlantic to implement an age verification system directly at the app store level. In other words, platforms like Apple and Google would be responsible for verifying whether users are eligible to download apps like Facebook or Instagram.

Davis suggested extending this system across all apps: “Implementing age verification at the operating system or app store level will help ensure that we create an ecosystem that’s safe for teens.”

“If we don’t address the entire ecosystem, what we’re likely to see is teens migrating from one app to another where they may not be as well protected,” she added.

A draft child safety bill in Louisiana proposed a similar measure, but strong lobbying by Apple led to its rejection last September. Currently, Meta’s age verification relies on self-assessment. “If we detect activity suggesting a user may not be the age they claim, we ask for verification through an ID or a selfie-based system that estimates age,” Davis explained.

In the so-called Louvain Declaration of April 2024, EU leaders pledged, among other things, to find solutions and standards across EU member states for identity and age verification.

Mental health of teens and social media

Meta’s safeguards for minors extend beyond age verification. Davis mentioned the new “teen accounts,” launched in September, which allow parents of minors to control the time spent on the app, monitor who the teen is messaging, and filter the type of content shown to their child. “In addition, we’ve turned notifications off in the evening, so after 10pm notifications are off so that teens can take a break, go to sleep, get a good night’s rest, as their parents would like,” Davis added.

Davis also discussed Meta’s approach to protecting young people from political content and sensitive materials. “We do have safeguards for minors and will continue to take active measures to ensure that teens are not seeing sensitive content,” she said. She further explained that, in order to protect teens from triggers related to eating disorders, the platform no longer shows diet ads to teens.

Meta came under scrutiny following a policy change that could allow homosexuality to be classified as a “mental illness”. When questioned about the potential impact this policy might have on the well-being of gay teens, Davis told Euronews it was a “top priority for the company to prevent the bullying of teens on our platform”. She clarified that Meta’s bullying and harassment policies remain unchanged for minors and that comments can be filtered by specific words.

The Commission has recognised the risks social media poses to young people, and Olivér Várhelyi, Commissioner for Health, said during his hearing at the European Parliament that he will “lead an EU-wide inquiry on the broader impact of social media on people’s well-being, with a special focus on children and young people”.
He indicated that he will work with Henna Virkkunen, Commissioner for Tech Sovereignty, and Glenn Micallef, Commissioner for Intergenerational Fairness and Youth, although no date has yet been given for the delivery of this study.

Meta on the proposal to fight sexual abuse online

Davis voiced her support for the latest draft regulation aimed at preventing the spread of child sexual abuse material. The proposal had been stalled for months in the Council of the EU, as member states struggled to reach a consensus on whether general scanning of messages, including encrypted communications, was necessary to combat child abuse crimes.

In a bid to find common ground, the Polish presidency of the Council dropped the idea of blanket monitoring of private chats by authorities, instead suggesting that responsibility for scanning should rest with social media platforms.

Davis endorsed the new draft proposal: “We welcome the deletion of detection orders because we fundamentally believe that it is possible to protect people’s safety in messaging apps, while also keeping the messages themselves private.”

More generally, she urged policymakers to work toward unified digital rules that provide consistent protections across the EU, emphasising that a harmonised approach is essential to safeguarding young people online.

She said Meta is focused on preventing harm from happening in the first place and encouraged users to report this kind of content so the company can take action.

Additional sources • Eleonora Vasques
