Instagram has detailed some changes to its account disable policy today. Until now, the policy disabled accounts where a certain percentage of their content violated Instagram's rules. Under the new policy being rolled out, Instagram will also remove accounts that rack up a certain number of violations within a window of time, in addition to those that cross the percentage threshold.
The Facebook-owned company says the change will allow it to enforce its policies more consistently and hold users accountable for what they post on the photo-sharing network. The goal, it says, is to ensure Instagram remains a supportive place for everyone.
It's also introducing a new notification process to help people understand when their account is on the verge of being disabled. The notification will also give them an opportunity to appeal content that was deleted.
Appeals will initially be available for content deleted under Instagram's policies on nudity and pornography, hate speech, bullying and harassment, drug sales, and counter-terrorism, with additional policies to be covered in the coming months. If a post is found to have been removed in error, Instagram will restore it and remove the violation from the account's record.