A new algorithm, trained to detect harassment in photos and captions, proactively sends offending content to the Community Operations team for review.
If a human moderator determines that the photo breaches the platform's community guidelines, it will be removed, and the poster will be notified of the deletion and told why.
While Instagram sees curbing bullying as an important step, it also wants to take things further by introducing a new camera effect to spread kindness on the platform. The feature was announced earlier this year, and Instagram finally began rolling it out after Adam Mosseri was appointed head of the photo-sharing site, following the resignation of its co-founders, who cited differences over decisions about the features rolled out on Instagram.
The machine learning mechanism will be looking for signs of harassment in photos, such as attacks on a person's appearance or character, or threats to their well-being.
Finally, Instagram wants its users to spread kindness, and to help do that, it has launched a new camera effect that puts hearts on the screen in selfie mode. The machine-learning technology that enables this kind of detection was covered a year ago by Wired, which highlighted the company's internal efforts to build comment filters. With the rear camera, the effect displays the word "kindness" in multiple languages. You are encouraged to tag a friend you want to support; they in turn receive a notification and can share the effect to their own story or use the filter to pay it forward.
A few months back, Instagram introduced a bullying comment filter that proactively detects bullying and hides offending comments from the feed.
Starting today, the filter that screens out offensive comments can also be enabled during live broadcasts.