Written by Alice Njoki.
Meta, the parent company of Instagram, has introduced important new safety tools to make the app safer for teens and children. These changes help young users understand who they are talking to, block harmful messages more easily, and reduce exposure to dangerous content.
For teens using Instagram and the parents managing their teens' accounts, there are new features in direct messaging. When a teen receives a message from someone new, they can now see details like the month and year the other person's account was created. This helps teens decide whether the person is trustworthy. Instagram also shows safety tips and provides a quick option to block and report users with just one tap. These changes make it easier for teens to protect themselves from scammers and bullies.
Instagram has also introduced a “Location Notice,” which warns teens if they are chatting with someone in another country. This is important because some scammers try to trick young users by pretending to be nearby when they are really far away. Another important tool is the nudity protection feature, which blurs inappropriate images in direct messages automatically. Most teens keep this feature turned on, and it helps reduce unwanted exposure to nude or sexual images.
Meta is also extending these safety protections to accounts run by adults who share content about children, such as family pages or talent managers. These accounts will have the strictest message controls and automatic filters to hide offensive comments. Instagram will also make it harder for suspicious adults to find and contact these accounts, keeping children safer from harmful interactions.
Meta has been actively removing accounts that break its rules in order to protect children. In recent months, the company took down hundreds of thousands of Instagram accounts involved in the sexual exploitation of children. Meta also notifies users when an account that interacted with their posts has been removed for inappropriate behavior, encouraging people to stay alert and use the blocking and reporting tools.
These updates from Meta aim to create a safer and more positive experience on Instagram for young people. Teens and children now have more tools to control their online interactions and are better protected from harmful content and people.