Instagram has drawn criticism in the past for deactivating accounts without prior notice. Now, the Facebook-owned company has changed its policies and will start warning users before disabling their accounts.
Going forward, Instagram users will receive in-app notifications when their accounts are at risk of being disabled for violating the community guidelines on nudity, pornography, bullying, harassment, hate speech, drug sales, and terrorism.
Users will also get a chance to appeal deleted content and disabled accounts from within the app instead of having to go through the Help Center.
Instagram also said that in addition to removing accounts with a certain percentage of violating content, it will “remove accounts with a certain number of violations within a window of time,” similar to how policies are enforced by its parent company Facebook.
Last week, Instagram rolled out AI-powered anti-bullying features, and with today’s update the company aims to keep its platform “a safe and supportive place.”