Interesting academic papers about comment sections


#24

Here's another on the impact of anonymity, tested on TechCrunch comments in 2011 by comparing Disqus and Facebook commenting:

Omernick, E. & Sood, S.O. (2013) The Impact of Anonymity in Online Communities, 2013 International Conference on Social Computing (SocialCom)


#25

Here’s a paper analyzing 126 submissions on how to improve comment spaces on news articles: From Public Spaces to Public Sphere by Rodrigo Zamith & Seth C. Lewis.

“Four main themes emerged in the submissions: a need to (1) better organize content, (2) moderate content more effectively, (3) unite disjointed discourse, and (4) increase participation while promoting diversity. We find in these proposed solutions the possibility for relatively low-cost, easy-to-build systems that could moderate comments more efficiently while also facilitating more civil, cohesive, and diverse discourse; however, we also find the lingering danger of designing new systems that could perpetuate old problems such as fragmentation, filter bubbles, and homogenization.”

https://apps.cla.umn.edu/directory/items/publication/324159.pdf


#26

In search of online deliberation: Towards a new method for examining the quality of online discussions https://www.academia.edu/7404550/In_search_of_online_deliberation_Towards_a_new_method_for_examining_the_quality_of_online_discussions


#27

Plenty of great papers are cited in our blog post on the real name fallacy by @natematias: https://blog.coralproject.net/the-real-name-fallacy/


#28

Oh hey, how about our own paper with the Engaging News Project at UT-Austin?

Comment Section Survey Across 20 News Sites, Natalie Jomini Stroud, Emily Van Duyn, Alexis Alizor, Alishan Alibhai, & Cameron Lang

https://engagingnewsproject.org/enp_prod/wp-content/uploads/2017/01/Comment-Section-Survey-Across-20-News-Sites.pdf


#29

Slovenia has a law that makes the editor-in-chief legally responsible for comments on their site. (More on that here: Shutting down onsite comments: a comprehensive list of all news organisations.) Here’s a paper about how that law could be challenged under European human rights law:

http://zdjp.si/wp-content/uploads/2015/08/ceferin-meznar.pdf


#30

Anika Gupta’s master’s thesis from MIT, Towards a better inclusivity: online comments and community at news organizations

https://dspace.mit.edu/handle/1721.1/104258


#31

Anyone Can Become a Troll: Causes of Trolling Behavior in Online Discussions
Justin Cheng, Michael Bernstein, Cristian Danescu-Niculescu-Mizil & Jure Leskovec


#32

The New Governors: The People, Rules, and Processes Governing Online Speech
Kate Klonick, Harvard Law Review
https://papers.ssrn.com/sol3/papers.cfm?abstract_id=2937985

This one is about moderation practices at Twitter, Facebook, and YouTube. Also contains a good summary of the history of Section 230 of the CDA, as well as the history of moderation on those platforms.


#33

What we talk about when we talk about talking: Ethos at work in an online community, PhD thesis by Quinn Warnick, Iowa State University, 2010

  • A virtual ethnography of Metafilter

http://plus.quinnwarnick.com/warnick-dissertation.pdf


#34

The Bag of Communities: Identifying Abusive Behavior Online with Preexisting Internet Data
A smart way to get around the common problem of having no labeled training data for a machine-learning moderation system: train on data from other communities whose norms are already known.
https://dl.acm.org/citation.cfm?id=3026018
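
As a rough illustration of that idea (not the authors' actual pipeline), here's a minimal Python sketch: a classifier is trained on posts from outside communities whose norms are already known, then scores comments in a target community that has no labels of its own. The inline example posts and the TF-IDF plus logistic-regression model are placeholder assumptions.

```python
# Minimal sketch of the "bag of communities" idea: train an abuse classifier
# on posts from outside communities with known norms, then score comments in
# a target community that has no labeled data of its own.
# The tiny inline datasets and the model choice are placeholders, not the
# authors' actual pipeline.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

# Stand-ins for posts collected from communities with well-understood norms.
abusive_source = [
    "you are all worthless idiots",
    "get out of here, nobody wants your kind",
]
civil_source = [
    "thanks for the thoughtful write-up, this helped a lot",
    "interesting point, do you have a source for that figure?",
]

texts = abusive_source + civil_source
labels = [1] * len(abusive_source) + [0] * len(civil_source)

# Simple bag-of-words classifier trained entirely on the outside communities.
model = make_pipeline(TfidfVectorizer(), LogisticRegression(max_iter=1000))
model.fit(texts, labels)

# Score unlabeled comments from the target community.
new_comments = ["nobody wants your opinion here", "great article, thanks for sharing"]
for comment, score in zip(new_comments, model.predict_proba(new_comments)[:, 1]):
    print(f"abuse score {score:.2f}: {comment}")
```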

Conflict in Comments: Learning but Lowering Perceptions, with Limits
Participants reported learning most from comments containing constructive conflict.
http://dl.acm.org/citation.cfm?id=3025902


#35

“This is a Throwaway Account”: Temporary Technical Identities and Perceptions of Anonymity in a Massive Online Community
Alex Leavitt, Annenberg School for Communication & Journalism, University of Southern California
http://alexleavitt.com/papers/2015_CSCW_Leavitt_ThisIsAThrowawayAccount_AnonymityReddit.pdf


#36

Constructing the cyber-troll: Psychopathy, sadism, and empathy
Natalie Sest and Evita March, School of Health Sciences and Psychology, Federation University Australia

Highlights
• Trolling is an online antisocial behaviour with negative psychological outcomes.
• Current study predicted trolling perpetration from gender and personality.
• Trolls were more likely to be male, with high levels of trait psychopathy and sadism.
• Trolls have lower affective empathy, and psychopathy moderates cognitive empathy.
• Results have implications for establishing education and prevention programs.


#37

A low standard of comments increases reading among those who already participate, but “non-users are even more frustrated by the low quality of discussions. They consider such participation activities to be a waste of time, and are not willing to register.”
User comments: motives and inhibitors to write and read
Nina Springer, Ines Engelmann & Christian Pfaffinger
http://www.tandfonline.com/doi/full/10.1080/1369118X.2014.997268?src=recsys


Discussion of how comment incivility can affect how reliable you find the article:
The “Nasty Effect:” Online Incivility and Risk Perceptions of Emerging Technologies
Ashley A. Anderson, Dominique Brossard, Dietram A. Scheufele, Michael A. Xenos, Peter Ladwig
http://onlinelibrary.wiley.com/doi/10.1111/jcc4.12009/full

Suggests that frequent commenters are more civil than occasional ones:
Online and Uncivil? Patterns and Determinants of Incivility in Newspaper Website Comments
Kevin Coe, Kate Kenski, Stephen A. Rains
http://onlinelibrary.wiley.com/doi/10.1111/jcom.12104/abstract

How incivility reduces trust in the information:
The Role of Civility and Anonymity on Perceptions of Online Comments
Joseph Graf, Joseph Erba & Ren-Whei Harn
http://www.tandfonline.com/doi/abs/10.1080/15205436.2016.1274763

Have comments become so bad that their mere presence makes people trust journalism less?
Effects of civility and reasoning in user comments on perceived journalistic quality
Fabian Prochazka, Patrick Weber & Wolfgang Schweiger
http://www.tandfonline.com/doi/full/10.1080/1461670X.2016.1161497?src=recsys


#38

A paper from Danielle Citron and Benjamin Wittes suggesting a revision to Section 230 of the Communications Decency Act, the part of U.S. law that protects American companies from being held liable for the contents of user-submitted comments.

They suggest dividing websites into Good Samaritans (who make efforts to remove harmful content, and should receive immunity through the law) and Bad Samaritans (who don’t).

https://papers.ssrn.com/sol3/papers.cfm?abstract_id=3007720


#39

Outnumbered But Well-Spoken: Female Commenters in the New York Times
By Emma Pierson
Department of Statistics, Oxford University & Stanford University
CSCW '15: Proceedings of the 18th ACM Conference on Computer Supported Cooperative Work & Social Computing
Vancouver, BC, Canada, March 14–18, 2015

Using eight months of online comments on New York Times articles, we find that only 28% of commenters of identifiable gender are female, but that their comments receive more recommendations from other readers… We discuss the implications of these gender differences for democratic discourse and suggest ways to increase gender parity.

Citation: https://dl.acm.org/citation.cfm?id=2675134
Paper http://cs.stanford.edu/people/emmap1/cscw_paper.pdf


#40

Discussion quality diffuses in the digital public square
By George Berry (Cornell University) and Sean J. Taylor (Facebook)

A paper on sorting comments: Facebook compared a ‘Most Recent’ (reverse-chronological) ordering with a ‘Social Feedback’ ordering (most liked, etc.) to see how people engaged with each.
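
In case the two orderings are unclear, here's a tiny illustrative sketch; the Comment fields and the like-count scoring are placeholder assumptions, not Facebook's actual ranking.

```python
# Rough illustration of the two comment orderings compared in the paper:
# "Most Recent" (reverse-chronological) vs. "Social Feedback" (ranked by
# reader reactions). Fields and scoring are placeholder assumptions.
from dataclasses import dataclass

@dataclass
class Comment:
    text: str
    created_at: float  # Unix timestamp
    likes: int

comments = [
    Comment("First!", created_at=1_000, likes=0),
    Comment("A long, thoughtful reply.", created_at=1_100, likes=42),
    Comment("Me too.", created_at=1_200, likes=3),
]

most_recent = sorted(comments, key=lambda c: c.created_at, reverse=True)
social_feedback = sorted(comments, key=lambda c: c.likes, reverse=True)

print("Most Recent:    ", [c.text for c in most_recent])
print("Social Feedback:", [c.text for c in social_feedback])
```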


#41

http://comp.social.gatech.edu/papers/cscw18-chand-hate.pdf

You Can’t Stay Here: The Efficacy of Reddit’s 2015 Ban Examined Through Hate Speech

Eshwar Chandrasekharan, Georgia Institute of Technology
Umashanthi Pavalanathan, Georgia Institute of Technology
Anirudh Srinivasan, Georgia Institute of Technology
Adam Glynn, Emory University
Jacob Eisenstein, Georgia Institute of Technology
Eric Gilbert, University of Michigan

This paper looks at whether Reddit’s 2015 closure of communities filled with hate speech succeeded in reducing hate speech across the platform.

Key points:

The empirical work in this paper suggests that when narrowly applied to small, specific groups, banning deviant hate groups can work to reduce and contain the behavior.

The banning of r/fatpeoplehate and r/CoonTown led to the rise of alternatives on Voat.co, for example, where the core group of users from Reddit reorganized. For instance, in another ongoing study, we observed that 1,536 r/fatpeoplehate users have exact match usernames on Voat.co. The users of the Voat equivalents of the two banned subreddits continue to engage in racism and fat-shaming [22, 45]. In a sense, Reddit has made these users (from banned subreddits) someone else’s problem. To be clear, from a macro perspective, Reddit’s actions likely did not make the internet safer or less hateful. One possible interpretation, given the evidence at hand, is that the ban drove the users from these banned subreddits to darker corners of the internet.


#42

http://www.lindsayblackwell.net/wp-content/uploads/2017/11/Classification-and-Its-Consequences-for-Online-Harassment-2017.pdf

Classification and Its Consequences for Online Harassment: Design Insights from HeartMob

Lindsay Blackwell, University of Michigan School of Information
Jill Dimond, Sassafras Tech Collective
Sarita Schoenebeck, University of Michigan School of Information
Cliff Lampe, University of Michigan School of Information

We examine systems of classification enacted by technical systems, platform policies, and users to demonstrate how 1) labeling serves to validate (or invalidate) harassment experiences; 2) labeling motivates bystanders to provide support; and 3) labeling content as harassment is critical for surfacing community norms around appropriate user behavior… we argue that fully addressing online harassment requires the ongoing integration of vulnerable users’ needs into the design and moderation of online platforms.


#43

Not Funny? The Effects of Factual Versus Sarcastic Journalistic Responses to Uncivil User Comments
Marc Ziegele, Pablo B. Jost

Incivility in user comments on news websites has been discussed as a significant problem of online participation. Previous research suggests that news outlets should tackle this problem by interactively moderating uncivil postings and asking their authors to discuss in a more civil manner. We argue that this kind of interactive comment moderation as well as different response styles to uncivil comments (i.e., factual vs. sarcastic) differently affect observers’ evaluations of the discussion atmosphere, the credibility of the news outlet, the quality of its stories, and ultimately observers’ willingness to participate in the discussions. Results from an online experiment show that factual responses to uncivil comments indirectly increase participation rates by suggesting a deliberative discussion atmosphere. In contrast, sarcastic responses indirectly deteriorate participation rates due to a decrease in the credibility of the news outlet and the quality of its stories. Sarcastic responses, however, increase the entertainment value of the discussions.

http://journals.sagepub.com/doi/abs/10.1177/0093650216671854