It was easy to decide not to turn my 2007 studies of public library blogs and academic library blogs into longitudinal studies (by doing another study a year later, or two years later, and seeing how blogging had changed). The books didn’t do well, I concluded that my concept for them wasn’t sound (or at least didn’t excite the marketplace), and even when I published the core findings for free, readership for that issue of Cites & Insights was down.
I wondered how these 483 blogs were doing, at least at an overall level.
So I thought I’d do mini-studies. How long could that take? I already had the spreadsheets with URLs and 2007 figures. If I limited the mini-study to some simple facts–
- Whether the blog was still active
- The most recent post on or before a set date (I used May 31, 2009)–that is, the interval since the last post
- How many posts there were in May 2009
- How many comments there were on May 2009 posts
and the readily-derivable figures (changes in post frequency, changes in comment frequency, conversational intensity [comments over posts] and changes in conversational intensity)…well, I thought I could do the fact checking in five hours or so for each half and write up the results in another five hours or so. Better yet, I could add brief notes on some of the more interesting blogs (and where they were now) in an hour or two more.
That would bring a form of closure to the library blog study experiment, and maybe make it a little more worthwhile.
So that’s what I did.
And, indeed, it did take about five hours to do each of the two sets of fact checks.
Writing them up? A little longer–partly because some of the formulas for derivative metrics are a little tricky in Excel. (E.g., for changes in conversational intensity, where there may be no comments in 2007 or no comments in 2009, the Excel formula involves some fancy IF nesting to avoid division by zero or other fun stuff–especially because “none in 2007 and some in 2009” should yield an entirely different result from “some in 2007 and none in 2009.”)
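For readers curious about that branching, here’s a rough Python sketch of the same logic (the function names and the text labels for the zero cases are mine, not from the actual spreadsheet):

```python
def conversational_intensity(comments, posts):
    """Comments per post; None when there were no posts at all."""
    return comments / posts if posts else None

def intensity_change(comments_2007, posts_2007, comments_2009, posts_2009):
    """Change in conversational intensity between the two years.

    The zero cases have to stay distinct: "none in 2007, some in 2009"
    is an entirely different result from "some in 2007, none in 2009",
    and a plain ratio would divide by zero in the first case.
    """
    old = conversational_intensity(comments_2007, posts_2007)
    new = conversational_intensity(comments_2009, posts_2009)
    if not old and not new:
        return "no comments either year"
    if not old:
        return "comments appeared"      # none in 2007, some in 2009
    if not new:
        return "comments disappeared"   # some in 2007, none in 2009
    return new / old                    # ratio of the two intensities
```

So, for example, `intensity_change(0, 10, 5, 10)` returns `"comments appeared"` rather than blowing up on a zero denominator, while `intensity_change(4, 8, 6, 12)` returns the plain ratio `1.0`. In Excel the same four-way distinction takes nested IF tests wrapped around the division.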
But I was getting there…or was I?
The URLs, they are a-changing…
I knew there would be changed URLs. Aren’t there always? If an old blog had a link to a new one, I’d follow the link (and change the spreadsheet). But what if there was no such link? Did that mean the blog had died–or that the library hadn’t bothered to link the old to the new?
I’d done some quick blog-title searches along the way, mostly yielding no results. But when I was writing comments on “intriguing” public library blogs, I kept finding ones that had apparently disappeared…only they hadn’t really, if I just searched hard enough. And, of course, once you find a blog that wasn’t picked up in the fact-checking pass, that changes the spreadsheet, which may change the quintiles, which…
Pushing toward completion
Here’s where things stand now, I believe:
- It may not be harder than it looks, but it was harder than I expected.
- The Public Library Blogs article is as good as it’s going to get. It will be in the September 2009 Cites & Insights (out some time in the next two weeks, with luck). I believe it’s pretty much correct (it’s hard to type with crossed fingers).
- The Academic Library Blogs article is “done”–but part of me wants to go back and search for some of the missing blogs a little further. Which, if I find them, would mean redoing much of the article. And, frankly, I’m not sure how much that would mean. That article will be in the September C&I also…I’m just not sure whether I need to work on it some more.
I will post both spreadsheets on my website once the issue comes out, with an open invitation for library schools or crazy people to use them as the basis for further study…ideally crediting me for the early work, but certainly not with any restrictive licensing. (Technically, this is all facts anyway…and as such, not copyrightable.)
Then I’ll look at getting back to some regular essays and deciding what to do about liblogs, where I really do want to continue studying the situation, despite the book’s relatively awful sales…
And I really do want to thank readers and others for the clear response to this post and the FriendFeed equivalent. I could easily have spent a couple of hundred hours preparing that project…and at this point, doing it “for the greater glory of librarianship” just doesn’t feel right. Increasingly, I feel as though the libraries that could use the results most (those where local book publishing projects would particularly enrich the community) wouldn’t–either because they wouldn’t know about it or because those who knew about it wouldn’t think it was worthwhile. In which case, why bother?
End of August
By the way: Public Library Blogs: 252 Examples will go off sale at the end of August 2009. If you want a copy, now’s the time to get it. (Thanks to a huge surge of sales of Academic Library Blogs: 231 Examples in July–that is, one copy sold–that one will be available at least through September 2009.)
Postscript, 8/5/09: I did go back and search for some of the missing academic library blogs…and after an hour or so of that, concluded that it wasn’t worth the effort. (I wasn’t coming up with anything new and it was taking far too long to determine that.)
No broad study of blogs can be 100% accurate, for a variety of reasons. I’ll edit and publish this mini-study as it is, possible warts and all.
Oh, and one example supporting the claim in the paragraph above. At least one liblogger makes a practice of deleting posts after they’ve appeared, without notice. A study that includes that blog will show erroneous results. That’s just one example; there are many others.
Overall trends? You can get those pretty well, and I’ll argue that my studies show more about library-related blogs than anything else out there. Precise numbers? Somehow, precision and social media don’t mix very well…