Facebook now has almost 2 billion users. With more users come more problems: users have shared graphic videos across the site, especially since Facebook launched its live-streaming feature. Recent months have seen live-streamed suicides and rapes, as well as the confession of a man in Cleveland who posted video of himself gunning down another man.
Mark Zuckerberg said in a Facebook post Wednesday that Facebook is hiring 3,000 new workers for its community operations team, which is responsible for fielding reports from users who flag inappropriate content on the social network. The new hires “will also help us get better at removing things we don't allow on Facebook like hate speech and child exploitation,” he said.
Facebook will continue working with community groups and law enforcement to reach people who appear in the videos, or who post them, and may need help. The company is also developing artificial intelligence tools to detect inappropriate videos, but Zuckerberg believes that technology is years from maturity.
Additionally, artificial intelligence can’t yet understand the context surrounding a video. A Facebook post from July explains: “For instance, if a person witnessed a shooting, and used Facebook Live to raise awareness or find the shooter, we would allow it. However, if someone shared the same video to mock the victim or celebrate the shooting, we would remove the video.” Throw in other complications like government censorship, and artificial intelligence has a lot of potential for error.
Although this is good news, it won’t solve the problem. Users will still post inappropriate videos in the first place, and removal will still depend on other users reporting them. Automated systems like YouTube’s Content ID, or databases of known graphic material, can do little to stop new or altered footage. For now, Facebook can only shorten its response time.