OK, let’s get to the one that I was postponing for a bit, in this series of detailed notes on aspects of the 252 public library blogs covered in Public Library Blogs: 252 Examples.
To wit, how many comments appeared on posts within the 92-day study period.
That metric certainly isn’t the only measure of a blog’s success. It may not even be a particularly important one. (See this post for a thoughtful discussion of blog metrics and other assessment issues.) But one of the big selling points for library blogs does seem to be a “build it and they will come” assertion–that blogs will get the community actively involved in providing feedback.
In all, there were 1,768 comments on all of the blogs combined (after I excluded a couple of cases with dozens of comments, all of them obviously and regrettably spam). That’s an average of seven per blog–and, as I noted in the book, that’s a wildly misleading average, since nearly a quarter of those comments appeared on a single blog. Two other blogs had more than a comment a day (average); those three blogs represent 37% of the total comments for all blogs. (Four more blogs averaged more than a comment every other day, and five more averaged more than one every three days.)
The median? Zero comments. Only 118 of the 252 blogs had any comments at all; since more than half of the blogs had none, the median is necessarily zero.
Now, to be sure, some blogs simply don’t allow comments. If a blog consists of nothing but authors and titles for new books, there’s little reason to allow comments. If a blog is the library’s home page (and some pretty impressive blogs are just that), it’s not clear that comments would work very well. In a lot of cases, it was hard to tell why comments weren’t allowed.
Here are the quintiles:
- Q1, most comments: from five to 392 comments.
  - Average (mean): 32.1 comments
  - Median: 14 comments
- Q2, more comments: from one to five comments.
  - Average: 2.9 comments
  - Median: 3 comments
- Q3, average number of comments: from zero to one comment.
  - Average: 0.3 comments per blog
  - Median: zero comments
- Q4 and Q5: zero comments.
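The quintile breakdown above is straightforward to reproduce: sort the blogs by comment count, slice the sorted list into five equal groups, and take the mean and median of each group. A minimal Python sketch, using made-up comment counts rather than the study's actual data:

```python
from statistics import mean, median

# Hypothetical comment counts for illustration -- NOT the actual study data.
counts = sorted([392, 120, 95, 50, 30, 14, 9, 5, 3, 1] + [0] * 10, reverse=True)

def quintiles(values):
    """Split a descending-sorted list into five roughly equal groups."""
    n = len(values)
    return [values[round(i * n / 5):round((i + 1) * n / 5)] for i in range(5)]

for i, q in enumerate(quintiles(counts), start=1):
    print(f"Q{i}: {min(q)}-{max(q)} comments, mean {mean(q):.1f}, median {median(q)}")
```

The same slicing works for the subquintiles below; the only change is restricting the input to blogs with at least one comment.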
Just for fun, I did a set of subquintiles–including only the blogs that had at least one comment. If all of those with no comments simply didn’t allow them, that might be a good metric–but, as far as I remember, fewer than half of those lacking comments had comments disabled.
- Q1, most comments: from 15 to 392 comments.
  - Average (mean): 56.7 comments
  - Median: 30.5 comments
- Q2, more comments: from six to 14 comments.
  - Average: 9.8 comments
  - Median: 9.5 comments
- Q3, average number of comments: from three to six comments.
  - Average: 4.2 comments
  - Median: 4 comments
- Q4, fewer comments: from one to three comments.
  - Average: 2.1 comments
  - Median: 2 comments
- Q5, fewest comments: one comment per blog, so the average and median are both 1.
Not surprisingly, this metric fits well within the Pareto principle, only more so: 80% of the comments were in 11% of the blogs.
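That concentration figure is easy to check for any dataset: sort blogs by comment count, busiest first, and see what fraction of blogs it takes to accumulate 80% of all comments. A quick sketch, again with invented numbers rather than the real dataset:

```python
def share_of_blogs_for(counts, target_share=0.80):
    """Smallest fraction of blogs (busiest first) that together
    hold target_share of all comments."""
    ordered = sorted(counts, reverse=True)
    total = sum(ordered)
    running = 0
    for i, c in enumerate(ordered, start=1):
        running += c
        if running >= target_share * total:
            return i / len(ordered)
    return 1.0

# Invented example: one very busy blog holds 80% of the comments on its own.
print(share_of_blogs_for([80, 10, 5, 3, 2]))  # -> 0.2
```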
Incidentally: You could argue that these counts aren’t quite fair–they don’t measure the comments received during the quarter or all comments ever received on that quarter’s worth of posts. They measure all comments received on that quarter’s worth of posts when I did the analysis, which was mostly in July 2007. On the other hand, most legitimate comments on most real-world blogs appear within the first few days after the post, or at least within the first month or so. (I now automatically turn off comments after six months, because most “late” comments are actually spam.)