
Deleting A Post Before Publishing Won’t Stop Facebook From Seeing It

December 18, 2013

redOrbit Staff & Wire Reports – Your Universe Online

One-third of all Facebook status updates are deleted before being published due to self-censorship, but Facebook can still detect these abandoned updates and comments even though they are never posted.

The finding was confirmed last week when two Facebook researchers released the results of a self-censorship study that tracked the activity of millions of randomly selected Facebook users in the US and Britain over a 17-day period in July 2012.

The researchers – Adam Kramer, a data scientist at Facebook, and Sauvik Das, a summer intern at Facebook – monitored 3.9 million users and tracked when people began writing messages they later deleted without posting. Entries were counted only if at least five characters were typed, and were considered censored if they had not been posted within ten minutes.
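The counting rule described above can be sketched as a small classifier. This is an illustrative reconstruction only, not Facebook's actual code; the function and parameter names are hypothetical.

```python
# Illustrative sketch of the study's counting rule (hypothetical code,
# not Facebook's): an entry counts only if at least 5 characters were
# typed, and it is classed as "censored" if it was not posted within
# 10 minutes.

MIN_CHARS = 5          # minimum characters for a draft to count
WINDOW_SECONDS = 600   # 10-minute posting window

def classify_entry(chars_typed, seconds_until_posted=None):
    """Classify one draft as 'ignored', 'posted', or 'censored'.

    chars_typed: number of characters the user entered.
    seconds_until_posted: time until the draft was published,
        or None if it never was.
    """
    if chars_typed < MIN_CHARS:
        return "ignored"   # too short to count at all
    if seconds_until_posted is not None and seconds_until_posted <= WINDOW_SECONDS:
        return "posted"    # published within the window
    return "censored"      # started but never published in time

print(classify_entry(40))        # a 40-character draft never posted -> censored
print(classify_entry(40, 120))   # posted after 2 minutes -> posted
print(classify_entry(3))         # too short to count -> ignored
```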

Kramer and Das found that 71 percent of the monitored Facebook members censored themselves at some point, with 51 percent deleting at least one post over the 17-day period, choosing to leave an average of 4.52 posts unpublished. Another 44 percent censored their comments, with an average of 3.2 going unpublished. In total, 33 percent of all updates that are started on Facebook, and 13 percent of all comments, go unpublished, suggesting self-censorship is a “common practice” on the social network, the researchers said.

The study also found that 15 percent of the monitored Facebook users who began writing comments on a friend’s picture decided to delete the post. Men censored themselves more often than women did, and older users tended to censor themselves less than younger ones.

Das and Kramer also examined the users’ demographic information, “behavioral features,” and data on each user’s “social graph,” such as the average number of friends of friends and the user’s “political ideology.” They used this information to study three factors in relation to self-censorship: a user’s political stance, how that stance differed from the audience’s, and the user’s gender relative to the gender diversity of their network.

The findings showed that users are more likely to self-censor when they believe their audience is hard to define, but less likely to self-censor a comment on someone else’s post, where the audience appears better defined.

Das and Kramer suggested that people were unwilling to “diverge from perceived social norms” because of a fear of “spamming” friends and family with “uninteresting or unnecessary content”.

Facebook said it hopes to use the study’s findings to improve the social network, perhaps by allowing only targeted friends to view certain comments and updates. The researchers noted that the Google+ social network already lets users create “circles” of contacts with whom only certain updates are shared.

The average age of the Facebook users monitored in the study was 30.9 years, and their average tenure on Facebook was 1,386 days.

Facebook said that while it monitored when text was entered by a user, it did not track what the text contained.




