I just finished analyzing journal 2,200 (of 4,018) in the alphabetic spreadsheet of journals that:
- are in DOAJ as of May 2015
- didn’t match my master spreadsheet of journals (grades A-D using the old grades) included in the interim Fuller Report and the ongoing series of subject posts, either by a URL match or a title match
- aren’t listed as having a 2015 start date
(If you’re wondering, 2,200 of 4,018 gets me into the Ks–past the swamp of International Journals and the slightly less explosive swamp of Journals of)
Seems like an appropriate time to say how things are going and to offer a slight update to my irritable comments about what will happen with all of this. (That update is the last portion of this post, after the first one.)
Do note that all journals graded lower than D last time around–more than 800 of them–are being retested, so if some of the “X” numbers below seem high, they’re really not.
I’m using the revised grading system instituted for this pass, one in which all A and B journals–and only A and B journals–will be used for most analysis. (C journals will be noted briefly, but since they’re all journals I regard as “worth avoiding,” they won’t get a lot of play.)
A: Apparently Good
1,569 journals (of the first 2,200), including the following subgrades (noting that most A journals don’t have subgrades):
- 86 apparently ceased or canceled or merged (no articles since 2012)
- 9 apparently dying (no articles in 2014 and a “dying” pattern)
- 59 erratic (some years with few or no articles)
- 40 either on hiatus or very slow in posting or something
- 1 “oneshot”–a journal with a handful of articles in 2013, none before or since
- 61 small journals
B: Deserves some attention but still probably good
156 journals in all–just under one-tenth as many as A–including these subgrades:
- 6 with more author repetition than I’d like to see
- 16 with problematic English (where English is a primary language for the journal)
- 19 “garish or other site problems”
- 34 cases where journals highlight questionable impact factors
- 30 journals with minimal author guidelines
- 17 journals with mild peer review/turnaround/editorial board issues
- 33 journals with questionable but not clearly false claims
- 1 journal with questionable but not outrageous article titles
C: Questionable journals, probably best avoided
30 journals in all, including:
- 20 that appear to have APCs (fees) but don’t say what they are
- 1 with clear falsehoods on the journal site
- 2 with a mix of problems bad enough to be a red flag
- 3 with major peer review/turnaround/editorial board issues
- 1 incompetent site
- 3 journals with wholly absurd article titles
X: Journals not fully analyzed
445 journals (and “journals”) in all, including:
- 11 empty journals (no articles in 2011-2014)
- 21 cases of apparent malware, as flagged by McAfee Site Advisor or Malwarebytes
- 89 non-OA, including sites that require registration to read articles and, more questionably, ones that publish entirely conference/workshop proceedings
- 90 opaque or obscure: cases where I found it too difficult to figure out the article counts, mostly because there are no tables of contents, sometimes because the issues aren’t dated (in dates I can understand)
- 7 parking/ad/blog pages
- 55 cases where Chrome/Google Translate didn’t yield enough info for me to analyze the journal (sometimes because the info just wasn’t there)
- 21 reachable but unusable sites (the “incompetent site” from C could be moved here)
- 26 journals merged into other journals with no clear way of finding the original material
- 125 unreachable, including 404s, timeouts, DNS failures and others. Some of these will be retested. Note that I now assume that if a May 2015 DOAJ URL doesn’t work either through the Excel-to-browser link or when directly pasted into the browser address bar, the journal’s unreachable or incompetent.
The really encouraging figure there is that there have, so far, been only 55 cases out of considerably more than 1,000 where Chrome/Google’s translation wasn’t enough for me to proceed. If that continues, my final spreadsheet will represent around 97% of all DOAJ journals (excluding the opaque/obscure and translate-insufficient cases).
What happens with all this stuff
I believe the final spreadsheet (“final” only in terms of 2011-2014) will be the closest thing to a comprehensive picture of Gold OA during that period that we’re going to get. (I’ve read the Outsell 2013 paper, and given that Outsell is All About the Benjamins, you wouldn’t expect a comprehensive picture.)
Obviously I’m doing this work because I care about it.
Obviously I want to see the results used.
Yes, I’d like some institutional sponsorship, partly for a small amount of revenue but also–perhaps more importantly–because it would lend gravitas to the results, such that maybe some of those writing Important Articles on the Future of OA and scholarly articles would actually, you know, use the best available information as background.
Since then, the number of contributors has jumped from four to…well, four.
No, I’m not going to throw the work away. Yes, I’ll certainly make some of the major findings available in posts and in Cites & Insights.
The more detailed analysis, however, which I’d think any institution that paid $2.5K for Outsell’s OA report should consider as an essential fleshing out of the story…is not going to be given away. It’s not going to cost $2.5K either; more likely, somewhere in the $50-$60 region.
Posting the updated, comprehensive, anonymized spreadsheet? Tough to justify with no additional funding. Maybe after an embargo of sorts…
We shall see. Meanwhile, if you care about this stuff, the best way to show it is to contribute $50 to Cites & Insights, which will get you a link to the interim PDF, a special link to a site where you can buy the interim paperback for $7 plus shipping…and, when it’s ready, a link to an exclusive PDF of the comprehensive 2011-2014 report with hotlinked tables of contents and figures–a version that won’t be available for sale. This offer ends the day before I announce availability of the report.