On Feb. 26, 2025, Instagram Reels became host to a wide range of graphic videos, particularly ones depicting highly violent or sexual situations. Most users could not scroll through more than a couple of reels before encountering upsetting material, resulting in massive backlash from the Instagram user base. While Instagram has since fixed the issue and apologized, many have called it a day they will never forget.
This raises the question: is Instagram truly safe? Younger users, both on and off social media, are particularly vulnerable, especially now that they have no way of knowing whether they might accidentally be exposed to highly graphic or pornographic material again.
Instagram Reels, the app's short-form video feature, is one of its most popular offerings. According to data released by Meta, Instagram's parent company, over 1.8 billion people globally use Instagram Reels (Musically). While exact statistics are impossible to verify, a very large percentage of Instagram's user base is under the age of 18. Instagram's official policy states that users must be at least 13 to use the app.
Some say the app is not at fault for harming young users at all. Their argument tends to be that, while Instagram and other social media platforms should maintain moderation and protect their users from harmful content, it is ultimately the responsibility of a minor's parents to keep their child from accessing content that could be dangerous to view.
Others argue the opposite: Instagram is indeed at fault for exposing minors to highly sensitive content and should not only work to resolve the issue entirely but also take specific steps to prevent minors from viewing such material. Proponents of this view may point to Instagram's willingness to let younger individuals register for the app, arguing that this, not individual parenting decisions, is what allows the issue to arise in the first place.
Meta may have published a formal apology, but it has not mentioned any changes to its policies or privacy practices. Still, here is what it could or should do to protect minors:
- Raise the minimum user age. Even though many minors enter fake ages when creating their accounts, raising the minimum age would still help. The current minimum of 13, which many other social media platforms also use, is quite low.
- Remove the content altogether. Instagram currently permits a great deal of overly violent content, much of which quite literally qualifies as gore, to remain on the platform. The same can be said for some pornographic content. This material is harmful not only to minors but to individuals of all ages. While some sensitive content should stay up for informational purposes, much of what is currently permitted has no reason to remain on the platform.