It’s an old problem.
When the news of a major court case in New York came out, Facebook sent a flurry of updates to users who had commented on the story.
Many of them found that their posts were flagged by the company.
The problem wasn’t so much with Facebook itself, but with the company’s system for flagging comments.
The company says it has fixed the issue and that it will send more updates to its users.
However, it’s unclear if it will ever remove the feature altogether.
A recent study found that over half of the more than 200,000 comments posted to the New York Times on Thursday were flagged as inappropriate.
Facebook has a long history of targeting users over their political and social commentary, a practice that has repeatedly drawn their ire.
The social network has faced multiple controversies over the years, but the controversy that’s sparked the most public scrutiny is the case of Mark Zuckerberg.
In 2014, Zuckerberg was accused of censoring a photo of his son that featured a black-clad woman holding a sign that read “Black Lives Matter.”
The photo sparked an international outcry, with hundreds of celebrities, artists, activists and politicians voicing their opposition to its removal.
The photo was eventually removed from Facebook.
However, the company has also faced a backlash over other content deemed offensive on the platform, including the infamous “Hands Up Don’t Shoot” video, which shows a black man shooting at police.
Facebook also received criticism for failing to remove content containing hate speech, such as posts critical of a Muslim man and an anti-Semitic cartoon that included a Nazi symbol.