How is Facebook working to keep its community safe?
We're working to better protect your privacy and reduce the amount of misleading information on Facebook so that you see the stories that matter to you most.
Protecting Your Privacy
We're taking action on potential past abuse and putting stronger protections in place to prevent future abuse of our platform. We're making it easier for you to manage the apps you use, and if we remove an app for misusing data, we'll tell everyone who used that app. If you haven't used an app within the last 3 months, we'll remove the app's access to your information. We're also changing the way Facebook Login works to reduce the amount of data an app can request.
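To make the inactivity rule above concrete, here is a minimal sketch of how access could be expired after roughly 3 months without use. The field names (`last_used`, `app_id`) and the 90-day cutoff are assumptions for illustration, not Facebook's actual implementation.

```python
from datetime import datetime, timedelta

INACTIVITY_LIMIT = timedelta(days=90)  # roughly the "3 months" window described above

def find_stale_app_grants(app_grants, now=None):
    """Return the grants that should lose access because the app
    hasn't been used within the inactivity window.

    `app_grants` is assumed to be a list of dicts with a `last_used`
    datetime and an `app_id` (hypothetical field names).
    """
    now = now or datetime.utcnow()
    return [g for g in app_grants if now - g["last_used"] > INACTIVITY_LIMIT]

# Example: one app last used a year ago, one used yesterday
grants = [
    {"app_id": "quiz_app", "last_used": datetime.utcnow() - timedelta(days=365)},
    {"app_id": "photo_app", "last_used": datetime.utcnow() - timedelta(days=1)},
]
print([g["app_id"] for g in find_stale_app_grants(grants)])  # ['quiz_app']
```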
Clickbait and Spam
We're making changes to how we rank posts in News Feed so you see more stories from friends, family and the people who matter to you most. Clickbait and spam are posts designed to grab your attention and get you to click on links or interact with the post in a specific way. This includes headlines that mislead people, intentionally leave out important details or exaggerate information. We use these characteristics as signals, along with other information, to place these stories lower in News Feed.
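As a rough sketch of what "placing stories lower" can mean in ranking terms, the toy scorer below demotes a story in proportion to how strongly clickbait or spam signals fire. The function name, weights and signal values are illustrative assumptions, not Facebook's ranking system.

```python
def rank_score(base_relevance, clickbait_prob, spam_prob, penalty_weight=0.5):
    """Combine a story's base relevance with demotion signals.

    Higher clickbait/spam probabilities push the story lower in the
    ranking; the weights here are made up for illustration.
    """
    penalty = penalty_weight * max(clickbait_prob, spam_prob)
    return base_relevance * (1.0 - penalty)

stories = [
    {"id": "a", "base": 0.9, "clickbait": 0.8, "spam": 0.1},
    {"id": "b", "base": 0.7, "clickbait": 0.1, "spam": 0.0},
]
ranked = sorted(
    stories,
    key=lambda s: rank_score(s["base"], s["clickbait"], s["spam"]),
    reverse=True,
)
print([s["id"] for s in ranked])  # story "b" now outranks the clickbait-heavy "a"
```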
Fake Accounts
Fake accounts are closely related to the spread of misleading content. We block millions of attempts to register fake accounts every day, but some still get through our systems. Machine learning helps us identify behaviors unique to fake accounts so we can detect and deactivate them. We prioritize removing fake accounts that are highly active and reach a lot of people, because those accounts drive the spread of false news, spam and clickbait.
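The sketch below shows the general shape of behavior-based detection: train a classifier on behavioral features and score new accounts. The features, labels and model choice are invented for illustration and are not the signals Facebook actually uses.

```python
from sklearn.linear_model import LogisticRegression

# Each row is one account, described by behavioral features
# (friend requests sent per day, fraction of posts containing links,
# account age in days) -- invented for illustration only.
X = [
    [200, 0.95, 2],    # bursty, link-heavy, brand new account
    [3, 0.10, 900],    # normal activity on an older account
    [150, 0.80, 5],
    [5, 0.20, 1200],
]
y = [1, 0, 1, 0]       # 1 = fake, 0 = authentic

clf = LogisticRegression().fit(X, y)

# Estimated probability that a new, similarly bursty account is fake
print(clf.predict_proba([[180, 0.9, 3]])[0][1])
```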
You can always report a profile if you think it doesn't represent a real person.
False News
We are committed to reducing the spread of false news on Facebook. We remove fake accounts and disrupt economic incentives for people and Pages that share misinformation. We also use signals, like feedback from our community, to identify stories that may be false. In countries where we work with independent third-party fact-checkers, stories rated as false by those fact-checkers are shown lower in News Feed. If Pages or domains repeatedly create or share misinformation, we significantly reduce their distribution and remove their advertising rights. We're also working to empower people to decide for themselves what to read, trust and share by giving them more context on stories with tools like Related Articles.
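For a sense of how a "repeat offender" policy might be expressed in code, the sketch below reduces a Page's distribution and removes its advertising rights once it crosses a strike threshold. The threshold, field names and the 0.2 multiplier are assumptions for this sketch; the real thresholds and penalties are not public.

```python
STRIKE_THRESHOLD = 3  # illustrative; the actual threshold isn't public

def apply_repeat_offender_policy(page):
    """Reduce distribution and pull advertising rights for Pages that
    repeatedly share stories rated false by fact-checkers.

    `page` is assumed to be a dict with `false_ratings` (count of
    fact-checker "false" ratings), `distribution_multiplier`, and
    `can_advertise` -- hypothetical fields for this sketch.
    """
    if page["false_ratings"] >= STRIKE_THRESHOLD:
        page["distribution_multiplier"] *= 0.2   # significantly reduce reach
        page["can_advertise"] = False            # remove advertising rights
    return page

page = {"false_ratings": 4, "distribution_multiplier": 1.0, "can_advertise": True}
print(apply_repeat_offender_policy(page))
```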