
Friday, July 13, 2012

Digital Smarts Blog: The Scoop on the Facebook Reporting Tool

Ever wondered what happens when you press the “Report/Mark as Spam” link on your Facebook account? The company recently released an infographic that explains the process and draws back the curtain on what happens when users report content to the site. According to the graphic and an accompanying blog post, Facebook has several teams that handle different kinds of content: a Safety team, a Hate and Harassment team, an Abusive Content team, and an Access team, each dealing with a specific type of reported material. The infographic doesn’t go into much detail about how the teams assess content, but it does say that in some cases teams check potential violations against Facebook’s community guidelines and, where there is a credible threat of violence, will bring matters to the attention of law enforcement.

Safety and security on the social network have come into sharp focus as Facebook continues to consider plans to allow children younger than 13 onto the site. Consumer groups have already urged the site to ban ads aimed at children 12 and under if it goes through with the plan, and others have raised concerns about the effect cyberbullying could have on kids that young. Mashable unscientifically polled its readers on their opinions about a Facebook for a younger set and found that 78 percent would not approve, with many parents citing bullying as a concern.
