As the web has become more interactive in recent years, giving website users the ability to generate their own content, tension has grown between content generated by users and content generated by “professionals,” especially professional journalists and publishers. Many journalism professionals see the value of allowing users to comment on their work; for example, an interactive site that allows user commenting may increase users’ loyalty to the site, meaning they return often, generate page views, and increase advertising revenue. User comments can also further explore an issue discussed in an article by broadening the discussion to include the viewpoints of those who are not professional journalists. However, news site publishers have noted many problems with introducing user-generated content (UGC) to their sites, including the fact that users sometimes post inflammatory, insensitive, or low-quality comments that do not improve the experience of the average reader. These kinds of comments have increased the time and money required to moderate comment sections. Beyond inflammatory comments, however, lies the problem of highlighting very good or high-quality comments: to what degree can such comments be discovered through an automated process?

This dissertation sought to test the extent to which news site comments can be automatically evaluated for quality using a text-analysis system. Journalists were interviewed to obtain their views on user comments in general and comment quality in particular. The data from these interviews were used to generate hypotheses about which linguistic metrics provided by Coh-Metrix, a web-based text-analysis system, might be most indicative of comment quality as described by journalists. Finally, a content analysis and close reading of a sample of news site comments was conducted in order to describe news site comments as a writing genre.

Results from the interviews indicated that comment length, syntax, cohesion, narrativity, and individuality were all indicative of comment quality. However, statistical analyses on a sample of 246 comments failed to produce significant results for the linguistic metrics hypothesized to be indicative of quality. Alternatively, a “positive engagement score” scale was created and used to identify how “engaging” comments were; this scale was found to have a significant, though minor, positive relationship with the number of recommendations a comment received from readers. Finally, the genre analysis of the sample of news site comments revealed that comments in the sample share a communicative purpose: providing additional content related to the article under discussion, offering practical value for journalists and other professionals through tips and fact-checking, and providing a space where readers can debate the article in question and begin to develop a sense of community.
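
The following is a minimal, hypothetical sketch of the kind of analysis the last paragraph describes, relating a comment-level engagement score to reader recommendations; the file name, column names, and use of a Spearman correlation are illustrative assumptions, not the dissertation's actual data or procedure.

    # Hypothetical sketch: relating a per-comment "positive engagement score"
    # to the number of reader recommendations it received.
    # File and column names are assumptions for illustration only.
    import pandas as pd
    from scipy import stats

    # Each row: one comment, its coded engagement score, and its recommendation count.
    comments = pd.read_csv("comments_sample.csv")  # columns: engagement_score, recommendations

    # Spearman's rank correlation tolerates the skewed counts typical of recommendation data.
    rho, p_value = stats.spearmanr(comments["engagement_score"],
                                   comments["recommendations"])

    print(f"Spearman rho = {rho:.3f}, p = {p_value:.4f}")
    # A small positive rho with p < .05 would mirror the "significant, though minor,
    # positive relationship" reported in the abstract.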