Last month, Instagram introduced new features aimed at protecting teenage users. The “teen accounts,” which will be applied automatically at sign-up and to existing profiles, will be private, enforce messaging and content restrictions, and filter out offensive words and phrases in comments and message requests. To change these settings, teens must have parental supervision enabled and a parent’s approval.
Child safety has been debated since the rise of the internet, and those discussions have only intensified as social media grows in popularity. However, increased restrictions do not ensure kids are kept safe in digital spaces.
Companies and governing bodies alike struggle to find a balance between freedom and safety. Laws protecting children exist, but they are easy to bypass. It is far too simple for children to pretend to meet a website’s age requirements or to use the internet without their parents knowing.
Stricter limitations also risk being unconstitutional. Opponents frequently argue in court that such rules violate the First Amendment, making it difficult for the government to advance further legislation.
Concerns regarding teenagers on social media platforms are understandable. Young users’ curiosity and undeveloped media literacy may keep them from thinking critically about what they see. Navigating inappropriate content, managing interactions with others and internalizing common messages about body image all threaten their ability to use social media healthily and may harm their mental health.
However, writing off social media as entirely evil is misguided and misinformed. Excessive online usage can worsen teens’ mental health, but it is not the root cause. Offline factors such as home life, exposure to harm and identity exploration are more likely to lead teenagers to struggle.
Assuming all content is inappropriate is wrong as well. Social media allows people to learn about current events and discover a wealth of new information. For teenagers whose worlds are limited to sometimes restrictive environments, this can be eye-opening.
Community building and social interactions – the core of these apps – benefit young people. Marginalized kids, especially those who cannot safely express themselves, have the opportunity to form support systems and explore their identity. These interactions affirm that teenagers are not alone and that there is more beyond the close-mindedness of their offline reality.
Instagram’s proposed solution assumes all parents are fit to govern their children’s accounts, but that assumption does not hold. Social media usage is extremely nuanced, and older generations often do not understand that. The generally negative attitude toward young people being online overshadows social media’s usefulness.
Granting parents excessive access to teen accounts further harms those in unsafe households. Allowing parents to see the topics their teens view puts teens at risk of having interests or aspects of their identity forcibly outed. The escape and community that were once found on the platform will be taken away.
Implementing strict “sensitive content” restrictions also raises concerns. This vague umbrella term could block content related to bodily autonomy and bodily functions, such as menstruation education, as well as LGBT+ posts, which are notoriously flagged as inappropriate.
Social media companies should strive to protect young users, but Instagram’s rollout is deeply flawed. It does little to address the larger issue: teenagers need to know how to protect themselves in online spaces.
Social media is not going away, and it is unproductive to hide teenagers from it. A more effective approach would be to increase online literacy. Teaching teenagers how to filter out unwanted content, identify misinformation and protect their privacy will yield long-term, positive results.
Anaya Baxter is an integrated communications junior who can be reached at opinion@thedailycougar.com