# Growing Concern About Social Media Use Among Youth
Concern about the effects of social media on children and teens has grown in recent years. As research continues to link excessive social media use to risks such as addiction and mental health problems, lawmakers in various states and countries are taking action. This article examines evolving child protection laws and how platforms such as Discord are adapting to these regulatory changes.
## Child Protection Legislation
Social media platforms have been criticized for allegedly fostering addictive habits among teenagers and for failing to enforce age restrictions adequately. In response to growing public concern, several legislatures have begun to enact legally binding age requirements for social media use. Several U.S. states have proposed rules requiring parental consent for minors to access these platforms, while countries such as the UK and Australia have introduced stricter obligations for app developers to safeguard young users.
A central debate around these laws is who bears responsibility for age verification. Currently, app developers are tasked with ensuring that users meet the legal age requirements. However, major companies such as Meta have argued that app stores, such as those run by Apple and Google, are better positioned to handle age verification. This view has gained traction, with at least nine U.S. states considering legislation that would shift the age verification obligation to these tech giants.
## Discord’s Strategy on Age Verification
In response to new regulations in the UK and Australia, Discord has launched a trial that requires users to verify their age. Users can choose between two methods: scanning a government-issued photo ID or completing a facial scan. For now, the age verification system is limited to users in those two countries, but Discord has indicated it may expand to the U.S. if the trial succeeds.
Discord has specified particular scenarios in which users will be required to confirm their ages:
1. **Content Marked by the Sensitive Media Filter**: If a user comes across content that has been flagged, they may need to verify their age to continue viewing it.
2. **Adjusting Sensitive Content Filter Settings**: Users trying to modify their content filter settings may also be prompted to verify their age.
For example, a user whose account is set to blur potentially explicit images will need to verify their age before turning that filter off.
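The two triggers above amount to a simple gate on sensitive-content actions. The sketch below models that gate; every name in it is an illustrative assumption for this article, not Discord's actual client logic or API.

```python
def needs_age_verification(viewing_flagged_media: bool,
                           changing_filter_settings: bool,
                           already_verified: bool) -> bool:
    """Return True if the user should be prompted to verify their age.

    Hypothetical model of the two scenarios described above: encountering
    content flagged by the sensitive media filter, or changing the
    sensitive content filter settings. A user who has already verified
    is not prompted again.
    """
    if already_verified:
        return False
    return viewing_flagged_media or changing_filter_settings
```

For instance, turning off the blur filter (`changing_filter_settings=True`) on an unverified account would prompt a check, while ordinary browsing would not.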
### Verification Options
Discord offers two methods for age verification:
- **Facial Scan**: Users can have their age estimated from a facial scan. The estimation runs entirely on the user's device, so no data is uploaded or retained.
- **ID Scan**: Alternatively, users can scan their ID by following on-screen instructions, including taking a clear photo of the document. The image is uploaded for verification and deleted afterward.
Discord has said that users found to be below the minimum age for the app in their country will be banned from the platform, though an appeals process exists for those who believe the decision was made in error.
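The two verification paths and their outcomes can be summarized in a short sketch. Everything here (function names, the result structure, the age threshold) is an illustrative assumption based on the article's description, not Discord's actual implementation.

```python
from dataclasses import dataclass


@dataclass
class VerificationResult:
    method: str          # "facial_scan" or "id_scan"
    verified: bool       # did the user meet the minimum age?
    data_retained: bool  # per the description, neither path keeps the data


def verify_by_facial_scan(estimated_age: int, min_age: int) -> VerificationResult:
    # Age estimation happens on-device; nothing is uploaded or stored.
    return VerificationResult("facial_scan", estimated_age >= min_age, False)


def verify_by_id_scan(age_from_id: int, min_age: int) -> VerificationResult:
    # The ID image is uploaded for checking, then deleted afterward,
    # so no data remains once verification completes.
    return VerificationResult("id_scan", age_from_id >= min_age, False)


def handle_result(result: VerificationResult) -> str:
    # Users under the minimum age are banned, with an appeals route available.
    if result.verified:
        return "access granted"
    return "banned (appeal available)"
```

As a usage example, `handle_result(verify_by_facial_scan(estimated_age=17, min_age=13))` would grant access, while an ID showing an under-age date of birth would lead to a ban with the option to appeal.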
## Conclusion
As concern about social media's impact on young users continues to grow, platforms like Discord are moving proactively to comply with new regulations aimed at protecting children and teens. The adoption of age verification methods such as facial scans and ID checks reflects a broader industry shift toward prioritizing user safety and meeting legal obligations. As these changes unfold, parents, educators, and policymakers should stay engaged in discussions about responsible social media use among young people.
As social media continues to evolve, ongoing communication and collaboration between tech companies and regulators will be essential to building a safe online environment for younger users.