Instagram has been rolling out a series of features lately aimed at making the platform safer. For instance, it introduced a mandatory 13+ age requirement for creating an account, it is testing hiding like counts globally, and it recently removed the 'Following' tab, which had become a popular tool for keeping tabs on other users' activity. Now, Instagram has announced a new feature that will notify users if their caption may be considered offensive.
The feature uses an AI algorithm that can "recognize different forms of bullying on Instagram". When the AI detects a caption on a photo or video that could be considered offensive, it shows the user a warning, giving them a chance to reconsider their words before posting.
"In addition to limiting the reach of bullying, this warning helps educate people on what we don't allow on Instagram, and when an account may be at risk of breaking our rules," Instagram wrote in a blog post.