I should have known better.
After the stunning sales of the previous Liblog Landscape books, I should have just let it be.
Instead, I did a comprehensive study: every English-language liblog that was discoverable on the web in mid-2010. Thirteen hundred and four of them. Plus another thirteen hundred and twenty-seven “things” that I looked at but that didn’t qualify in the end, including 306 that had disappeared entirely or now required passwords to read, 118 that had been renamed (and are actually part of the 1,304), a dozen begun later than May 31, 2010, and things that either weren’t blogs at all or were blogs that appeared in liblog blogrolls but weren’t liblogs.
On the other hand, while amassing information on an absurdly broad range of liblogs, I didn’t get too crazy: I didn’t write profiles for individual blogs. I didn’t attempt to break down blogs by blogger affiliation. And, gulp, I did determine a lot of stuff about each liblog (with the percentage of blogs for which I got the information in parentheses):
- Country in which the blogger resided when the blog was checked (93%)
- Blog software used, if one of seven possibilities (96%)
- Google PageRank, which I don’t seem to have a way to get anymore, but could back then (81%)
- Year and month of the first post I could locate (100%)
- Longevity of the blog in months through May 31, 2010 (100%)
- Currency: how current the most recent post was as of May 31, 2010 (99+%)
- Total posts through May 31, 2010, where it was plausible to get that figure (91%)
- Count, length, and comments for each of four three-month periods (March-May 2007, 2008, 2009 and 2010). I wasn’t stupid enough to try to capture all the posts, but I did take some pretty large samples. Blogs with countable posts (which, of course, also requires that the blog existed during that period) ranged from 52% in 2007, when 36% of the blogs were too young, to 67% in 2009. For those blogs with countable posts, the ones for which length could be calculated ranged from 87% (736 blogs in 2010) to 92% (746 blogs in 2008). Blogs with posts that had countable comments ranged from 72% in 2010 to 81% in 2007.
I also divided blogs into three types (book and other reviews, technology, and everything else) and four groups based on Google PageRank and level of posting during March-May 2010. There were 115 review blogs, 405 mostly-technology blogs and 784 others. The groups: 443 “core blogs,” 207 “less active visible blogs,” 364 “also alive” blogs and 290 “mostly defunct” blogs.
The 237-page book didn’t profile any liblogs (I was going to do that piecemeal as copies were sold, but gave up because sales were too slow to justify the effort). It did have loads of tables and graphs on various aspects of measured performance and characteristics, with lists of the standout blogs in each area.
I dunno. It might have made a good thesis. Looking at the book now—my own copy is, I believe, one of eleven total copies—makes me tired just thinking about the hundreds of hours of work that went into this. The library field collectively didn’t even yawn, and maybe that was appropriate. I honestly believed that these books were worthwhile for library schools, and if I’d sold 45 copies of this one, I’d have been delighted. That didn’t happen.
Here’s what I find when I run a quick revenue report from Lulu going back to 2008, looking only at the Liblog books and ignoring a handful of copies of one of them that might have sold via CreateSpace:
- The Liblog Landscape 2007-2008: 54 copies
- But Still They Blog: 24 copies
- The Liblog Landscape 2007-2010: 11 copies
Chapters 2 and 3 appeared in Cites & Insights. Had there been visible sales, more chapters would have appeared there.
I have to admit: the research projects I’ve done since then have been considerably more substantial, if sometimes not as much fun.
After this series, I stopped doing self-published books for a while…or at least writing self-published books. That was a sensible move.
Crawford, Walt. The Liblog Landscape 2007-2010 (pbk.)