Facebook and its products have had a rough time recently. Ever since the Cambridge Analytica scandal broke, the company has lurched from one controversy to the next. This time, however, Facebook has come clean about one of its controversial ‘features’: Instagram’s tendency to recommend mildly offensive posts to users.
Facebook has announced a major campaign to regulate content on two of its major platforms, Messenger and Instagram. The company calls this strategy ‘remove, reduce, inform’, and it aims to curb objectionable content on both platforms.
Facebook says that this campaign is for “removing content that violates [the company’s] policies, reducing the spread of problematic content that does not violate [Facebook’s] policies, and informing people with additional information so they can choose what to click, read or share”. The company says that the campaign will help to “manage problematic content”.
On Instagram, Facebook says it is “working to ensure that the content [recommended] to people is both safe and appropriate for the community”. As part of this effort, Instagram has also updated its community guidelines, which now state that it will limit the exposure of posts it considers inappropriate by not recommending them on the Explore or hashtag pages.
It is not entirely clear what Instagram flags as “inappropriate” content. However, TechCrunch reports that Instagram has set guidelines which say that “violent, graphic/shocking, sexually suggestive, misinformation and spam content can be deemed ‘non-recommendable’” or inappropriate.
Most importantly, this means that even a sexually suggestive post that contains no nudity will be demoted from Instagram’s recommendations. However, Instagram says that such posts will still appear in the feeds of the account’s followers.