You’re a PPPPredator! You’re a PPPPredator! You’re ALL PPPPredators!

October 20th, 2015

I think I finally get what Jeffrey Beall is driving at, given his apparent standard that One Bad Article Condemns An Entire Publisher and his apparent plan to discredit each significant gold OA publisher, one at a time, for some reason…

Namely, he’s too narrow but he’s right–and I’m using “PPPPredator” for “potential, possible or probable predatory publisher.”

I apologize for doubting him; I simply failed to realize the Oprahism in what he’s saying, once you remove the OA-only blinders: to wit, every publisher is a PPPPredator.

No? Consider:

  • BMC is owned by Springer Nature. Now that Beall's pointing out one possibly defective article in one journal from BMC–while phrasing it as an attempt to discredit BMC in general, with the tagline "This is scholarly open-access publishing"–it only makes sense to conclude that Springer Nature is a PPPPredator.
  • Frontiers just had the honor of being added to Beall's list because…well, because Beall Gets Complaints. (But then, it's also part of Springer Nature, and we already know Springer Nature is a PPPPredator.) Corrected: While Holtzbrinck, owner of Nature Publishing Group (now part of Springer Nature), is a minority investor in Frontiers, NPG itself does not own Frontiers. I regret the error/oversimplification.
  • Beall’s made it clear that APC-charging journals inherently represent a conflict of interest (but apparently subscription journals with page charges don’t). Pretty much every major subscription journal publisher now has at least “hybrid” journals (a substantial majority of subscription journals from larger publishers now offer “hybrid” options, at least based on Outsell’s reports), and most of them have APC-charging gold OA journals as well–so Elsevier, Wiley, Taylor & Francis, SAGE, Oxford University Press, Cambridge University Press, the American Chemical Society, BMJ, RSC, IOP, the American Institute of Physics…all PPPPredators. (With a bit of research, I could extend that list quite a bit…)
  • AAAS (publisher of Science)? Yep: it publishes the APC-charging Science Advances (even if its other journals aren’t “hybrid,” which I don’t know), so AAAS is a PPPPredator.
  • Hmm. That does leave gold OA publishers funded by means other than APCs, but since Beall’s already attempted to discredit SciELO and Redalyc for being “favelas,” I’m sure he has similar approaches at the ready for other serious non-APC gold publishers.

So there it is. You’re a PPPPredator! You’re a PPPPredator! You’re all PPPPredators!

(Now that I think of it, I don’t believe ALA has any APC-charging gold OA journals or hybrid journals, although it does have both non-APC gold OA and subscription journals…but ALA’s published my writing on open access (or my “bilge” as Beall termed it in one of his never-ad-hominem remarks, which he since deleted from the comment stream in which it appeared), so they must be a PPPPredator.)

Well, that’s a relief.

Actually, it’s also true: any publisher is potentially a predatory publisher, especially when one man gets to determine what’s predatory. Pretty much every publisher will occasionally publish a “bad” paper, possibly one that some others think is “obviously” bad, possibly even one that’s plagiarized. Pretty much every publisher will have at least one journal where at some point the editorial board or peer review may involve issues (excessive publication, editorial overrides, etc.).

I need to modify some previous conclusions. As far as I can tell, somewhere between 1.4 and 2.5 million papers were published last year by PPPPredatory publishers, in the most general sense.

If you want to avoid all PPPPredatory publishers…you’ll just have to self-publish or go directly to arXiv or some other archive.

Or you could step back, take a deep breath, and look at journals using a little judgment and, for open access, the Directory of Open Access Journals, a whitelist that’s getting better and better. And maybe a little common sense. If you believe your already-paid-for scholarly research deserves the widest possible audience, there are thousands of serious gold OA journals available that don’t even charge author-side fees. (At least 6,383 of them published articles in 2014.)

Oh, and since this is my own odd little contribution to Open Access Week, let me add: If you want to know more about the realities of serious gold OA publishing from 2011 through 2014, based on a 100% “sample” of what’s out there, I’ll recommend my book The Gold OA Landscape 2011-2014, available in paperback form or as a site-licensed non-DRM PDF ebook. Every library school should have a copy; so should every serious OA publisher, at the very least. So, IMNSHO, should every ARL library.

A couple of non-footnotes:

  • What? You believe one fundamentally flawed journal is enough to discredit a publisher, even if one article isn’t? You might check into the publisher that continues to publish a “scientific” journal that presumes that water somehow has memory… Just a hint, the name begins with Els…
  • If you think I’m saying “All publishers are alike” or “There are no fundamentally defective journals” or “There are no publishers more interested in scamming money than in actual scholarship”–I’m not.
  • If you think I’m saying “Blacklists are fundamentally flawed, and any transparent blacklist would include every major publisher”–well, yes, I am.
  • NOTE: A handful of possibly-inflammatory words changed at 4:15 p.m. PDT October 20, 2015. The message stays the same.
  • And, of course, “pretty much every” does not at all mean “every,” just “many or most large and well-known.” Maybe it’s in the same truth-space as “all gold OA publishing involves APCs.” Maybe not.

The Gold OA Landscape: quick update

October 16th, 2015

The Gold OA Landscape 2011-2014 came out on September 11, 2015* (35 days ago). The free excerpted version, Cites & Insights 15.9, came out the next day (September 12, 2015).

To date (excluding most of September 30 for the C&I figures):

  • The single-column version of C&I 15.9 has been downloaded at least 1,607 times, while the two-column version has been downloaded at least 202 times. So apparently 1,809 people (or more) find the work worthwhile.
  • Other than my own copy, the paperback edition has sold five copies; the site-licensed PDF ebook has sold one copy.
  • Exactly one copy of the book has sold in the first half of October 2015; the previous sale was on September 24, so one copy of the book has sold in the last 22 days.
  • New contributions to C&I to encourage continued work, since September 11, 2015: $0.
  • New sources of funding: Zero.

It’s early yet, but the nearly complete lack of activity is not encouraging. (The single copy sold in October was purchased from Lulu’s German outpost.) I’m roughly one-sixth of the way toward making the data freely available and one-tenth of the way toward considering continued research.

(There are no copies in WorldCat so far. That’s not surprising. It is gratifying to see that Open Access: What You Need to Know Now shows 1,216 copies in WorldCat. Maybe I should have quit while I was ahead…)

*Technically, the ebook came out on September 10, but didn’t yet have the explicit site license statement.

Frontiers: Get the numbers right

October 15th, 2015

On October 13, 2015, Frontiers posted a piece on its blog, “Frontiers’ financial commitment to open access publishing.” It’s sort of an effort to be transparent about the OA publisher’s finances, although “sort of” may be the right qualifier, as it lumps all the publishing-related stuff into one $6.8 million chunk (only 34% of total spending).

But I’m not commenting on the piece in general; at least, it’s more data than we have for a lot of other players in scholarly journals. Instead, I’m concerned about the second paragraph, because I believe it tends to undermine the remainder of the post.

I have a slight concern about the very first sentence in the post as well, to wit: “In 2014, the annual cost of traditional, subscription-based scholarly journal publishing was $14 Billion.” Suddenly seeing total spending on traditional scholarly journals (this isn’t cost, this is price) jump by 40% is rather startling. But at least that figure is sourced, sort of: apparently it involves combining two different reports, one of them published before the year it supposedly covers. But I’ll leave that for somebody else to deal with.

Here’s the paragraph in question:

Open Access does away with subscriptions to allow any reader in the world unrestricted access to scholarly articles. To provide this option, Open Access publishers directly charge the authors an Article Publishing Charge (APC), which authors typically pay from their grants or receive institutional support to cover the cost. The APC generally ranges from $500 to $6,000 with an industry average of around $3,000. Often people wonder “Where does this money go?”

The first sentence is just fine. The last sentence is just fine.

But those middle two sentences:

To provide this option, Open Access publishers directly charge the authors an Article Publishing Charge (APC), which authors typically pay from their grants or receive institutional support to cover the cost.

Can y’all repeat with me the old refrain? Most OA journals do not have Article Publishing Charges. And yet this sentence doesn’t say “most Open Access publishers” (which would be false) or “the OA publishers publishing the most articles” (which would be true for some fields and as a whole, but false for others). Nope. It’s an unqualified “Open Access publishers.” Which is convenient, of course, if you’re an APC-charging OA publisher…

I’ll give you two real numbers–one that’s Published by a Reputable Publisher but covers only about two-thirds of serious gold OA journals (that is, the journals in DOAJ), another that covers nearly all serious gold OA journals but is not Published by a Reputable Publisher, instead being self-published by, well, me.

Published by a Reputable Publisher*:

67% of the gold OA journals that are accessible to English-speaking readers, reachable on the web, and actually published articles between January 1, 2011 and June 30, 2014 do not charge article processing charges. (Biomed is the only broad segment in which a slight majority of journals–53%–do have APCs.)

Published by Walt Crawford with transparent methodology:

Of the 9,512 gold OA journals that don’t raise warning flags and published articles between 2011 and 2014, 74% do not charge APCs. (A higher percentage of non-English journals are free to authors.) Of the 8,760 that actually published articles in 2014, 72.9% do not charge APCs. In 2014, 42.8% of articles in gold OA journals were in journals that don’t charge APCs. Even in biomed, a majority of journals (56.2%) do not charge APCs.

Now, let’s look at the second of the two offending sentences:

The APC generally ranges from $500 to $6,000 with an industry average of around $3,000.

As with the first sentence, this one’s not sourced, but here are some real numbers–again, one covering two-thirds of serious gold OA journals (those at least partly accessible to English-speaking people) and Published by a Reputable Publisher, and one covering nearly all serious gold OA journals:

Published by a Reputable Publisher*:

Fees range from $8 to $5,000. The average is such a silly figure that it wasn’t published, but only 12% of APC-charging journals charge $2,000 or more and only 16 out of 2,064 charge $3,000 or more. So an “industry average” of $3,000 must define the “industry” to include only the 30 most expensive journals. (42% of the journals charge less than $450, for what that’s worth.)

Based on article counts, the average APC per article in APC-charging journals is $1,045; the average across all articles is $630. Even for biomed, the average across APC-charging journals is $1,460, a far cry from $3,000.

Even in biomed, a minority of APC-charging journals charge $1,451 or more.

Published by Walt Crawford with transparent methodology:

I didn’t provide top and bottom figures because they’re not very meaningful, but the top quartile of APC-charging journals (that is, the 25% with the highest APCs) begins at $1,420 and the second quartile begins at $600, so the typical APC is somewhere in the $600 range. (Even in biomed, only 520 out of 1,365 APC-charging journals charge $1,420 or more.)

Based on article counts, the average charge per article for APC-charging journals in biomed was $949 in 2014.
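The distinction between the per-journal average and the per-article average matters because high-volume journals dominate article counts. A minimal sketch of the two calculations, using made-up journal data (the APCs and article counts below are purely illustrative, not figures from either study):

```python
# Hypothetical (APC in USD, articles published) pairs -- illustrative only.
journals = [(0, 500), (600, 200), (1420, 300), (5000, 20)]

apc_journals = [(apc, n) for apc, n in journals if apc > 0]

# Unweighted average APC per APC-charging journal.
per_journal = sum(apc for apc, _ in apc_journals) / len(apc_journals)

# Average APC per article in APC-charging journals (weighted by volume).
fees = sum(apc * n for apc, n in apc_journals)
per_paid_article = fees / sum(n for _, n in apc_journals)

# Average across ALL articles, including those in no-fee journals.
per_article = fees / sum(n for _, n in journals)

print(per_journal, per_paid_article, per_article)
```

Even with invented numbers, the pattern holds: the per-journal average sits well above the per-paid-article average, which in turn sits well above the average across all articles, because a rare expensive journal publishes few articles while no-fee journals publish many.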

Let’s look directly at the spreadsheet (not published, and in this case I’ll even include a few hundred journals that seem to be sketchy–those graded C–for a total of 9,824 journals):

  • The highest APC in 2014 was still $5,000, with only one journal at that level, but among this broader group of journals, there are 28 charging $3,000 or more–that is, 28 out of 2,619 APC-charging journals.
  • The low is now $2 (in U.S. dollars), not $8.
  • The average APC is $830.
  • The median APC is $600, as you’d expect.

Ah, but those numbers include a few journals that I regard as sketchy. So let’s look at just the 9,512 journals that appear to be good:

  • The high is still $5,000, with 28 journals out of 2,470 charging $3,000 or more. (That’s just a little over 1%–an odd version of “average.”)
  • The low is still $2.
  • The average APC per journal, a very silly figure, is now $842.
  • The median APC is…still $600.
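The gap between that $600 median and any claimed $3,000 “average” is what you’d expect from a heavily right-skewed fee distribution: a few expensive journals drag the mean up while the median stays put. A quick sketch with an invented APC sample (not the actual spreadsheet data):

```python
from statistics import mean, median

# Invented, right-skewed APC sample in USD -- not the actual spreadsheet.
apcs = [100, 300, 500, 600, 600, 800, 1200, 2000, 5000]

print(mean(apcs))    # pulled upward by the single $5,000 journal
print(median(apcs))  # the typical journal's fee, unmoved by the outlier
```

With skewed data like this, the median is the honest summary of what a typical journal charges; quoting the mean (let alone a mean computed over only the priciest journals) tells a very different story.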

I suppose you could inflate that “average APC” a lot by including so-called “hybrid” journals, which tend to charge extremely high fees for the handful of suckers–er, wealthy authors and funding agencies–that pay to (possibly) make their articles open while shoring up the $10 (or $14?) billion subscription marketplace. But by my reading, it’s a bad set of numbers (with no sources provided).

*Open-Access Journals: Idealism and Opportunism, published as the August/September 2015 issue of Library Technology Reports, an imprint of the American Library Association.

The Gold OA Landscape 2011-2014: Medicine

October 12th, 2015

Another in an intermittent series of posts encouraging folks to buy The Gold OA Landscape 2011-2014, in part by noting what’s not in the excerpted Cites & Insights version.

Chapter 10 is Medicine–which probably should be broken into, say, half a dozen subsets, but I don’t know enough to make that breakdown. It’s by far the largest subject, as noted in the excerpted version.

A few items from the book’s coverage:

  • While a majority of articles published in serious gold OA journals in 2013 and 2014 involve APCs, a majority of those in 2011 and 2012 were in no-fee journals.
  • Fee-based articles have more than doubled since 2011.
  • 44% of articles involving APCs appeared in journals within the most expensive segment, $1,960 and up–and the average for articles involving APCs was $1,446 per article ($854 per article overall).
  • There does seem to be a gold rush of APC-charging journals starting in 2007 and peaking in 2009-2010.
  • You can probably guess the two countries publishing the most medicine articles in 2014, but maybe not the order: UK first, US second. Iran is sixth. For the rest of the 22 countries with at least 1,000 articles, see the book.

Much, much more in the book. Worthwhile for your library or if you’re seriously interested in OA. If enough copies sell (no change in the last week), the anonymized spreadsheet will go up on figshare; if enough more copies sell (or some other form of funding comes through), the study will be continued in 2016 for 2015 publications.

And you can buy the book through Amazon (and possibly Ingram), although it counts three times as much toward sales goals if you buy through Lulu.

This should not be my fight

October 7th, 2015

I’ve probably said this before, but thinking about yesterday’s post reminded me of it once again.

That is:

This should not be my fight.

No, I haven’t gone to each site that wrote a story touting the Shen/Björk article to point out the problems with the data–especially now that it’s clear what the response will be. Somebody should. They have the actual data.

But it shouldn’t be me. It’s really not my fight.

I didn’t even start out to discredit Beall’s lists. I did cross swords with him on his absurd notion that the Big Deal had solved the serials crisis, but I did a real-world study of the journals and publishers on his lists to get a reality check. I was fully ready to believe that the picture was as bleak as he painted it–and if that’s how the data had come out, that’s how I would have published it.

After all: I don’t publish any OA journals. I’m not on the editorial board of any OA journals. I don’t need publications for tenure (I’m retired and was never in a tenure-track position). I don’t make big bucks from speaking fees (haven’t done many appearances lately, and that’s OK). I sure as heck don’t make big bucks from the data gathering and analysis, although ALA Editions has published some of my work in the area (not big bucks, but some bucks and a venue I regard highly).

For that matter, I’ve been the subject of ad hominem attacks from Stevan Harnad as well as Jeffrey Beall, so I’m not even well-liked among all OA folks.

What I’ve been trying to do is see what’s actually happening and bring my 26 years of off-and-on experience with OA to bear in looking at what’s going on now and what’s being said. My mildly obsessive personality and retired status, and reasonably well organized techniques, have allowed me to do some large-scale studies that wouldn’t have been done otherwise. (With modest funding, I’d keep on doing them.)

It’s painful to see questionable results spread far and wide: it hurts good OA (the bulk of it) and probably doesn’t do much to questionable OA. It’s painful to see librarians and others take the easy way out, relying on a seriously defective set of blacklists rather than starting with an increasingly good whitelist (DOAJ) and working from there.

I’ll continue to provide facts and perspectives. (I’ve just subdivided a bunch of tagged items into a baker’s dozen subtopics within the overall “Ethics and Access” topic. That’s probably the December Cites & Insights; it might also be the January one, depending on how it goes.) I’ll continue to post the occasional post. I’m hoping some libraries, librarians, OA folks and others will eventually buy the book (which is apparently now available on Amazon as well as via Lulu; it may also be on Ingram, but I have no way of testing that). It’s always a pleasure to see my work being cited or used where it’s appropriate.

I’m not going away just yet…but as for coping with all the misrepresentations, well, it’s not (or at least not entirely) my fight.

For those of you who need a Respectable Published Source:

I refer you to Open-Access Journals: Idealism and Opportunism, published by the American Library Association. That link gets you to the $43, 40-page monograph (published as the August/September 2015 issue of Library Technology Reports). You can also go here to read the first chapter or order the ebook version (I believe you can also order individual chapters). If you’re in one of the several hundred libraries that subscribes to Library Technology Reports, it should already be available to you. (The link here is to one of two records for the series.)

Open-Access Journals: Idealism and Opportunism was professionally copy-edited, edited, and typeset. It was also reviewed by three professionals (two librarians, one other), although that wasn’t formal peer review. It’s concise, and includes not only real-world figures for 6,490 gold OA journals (in DOAJ) publishing 366,210 articles in 2013, but also chapters on the “sideshow” of Beall’s lists, on dealing with OA journals (including spotting questionable journals), and on libraries and OA journals.

(It’s not a complete survey of DOAJ, because it doesn’t include journals that lack an English-language interface option. It also goes through June 30, 2014 rather than the end of 2014–thus, the 366,210 count is for 2013). It’s also, of course, far less detailed than The Gold OA Landscape 2011-2014.

But it’s concise, well-edited, based on an actual survey rather than sampling, and published by what I consider to be the premier publisher in librarianship, part of the world’s largest library association. So it has that level of authority that my self-pubbed works may not have.

The author? Walt Crawford. (No, I’m not angling for extra money here: the fee for preparing the issue was a one-time fee, with no royalties. But the final chapters make it a great resource, and for those who require Reputable Publishers, you don’t get more reputable than ALA.)



The Gold OA Landscape 2011-2014: a brief note on numbers

October 6th, 2015

Here’s the tl;dr version: Go buy The Gold OA Landscape 2011-2014, either the $60 paperback or the $55 site-licensed PDF ebook (the contents are identical other than the copyright page/ISBN). I try to be wholly transparent about my investigations, and I’m confident that TGOAL represents the most accurate available count for serious gold OA publishing (excluding non-DOAJ members, “hybrids” and other stuff). Oh, and if enough copies are sold, I’ll keep doing this research…which I don’t think anybody else is going to do and which, as far as I can tell, can’t really be automated.

Running the Numbers

Now that I’ve said that, I won’t repeat the sales pitch. You presumably already know that you can get a hefty sampling of the story in Cites & Insights 15:9–but the full story is much more complete and much more interesting.

Meanwhile, I’ve gotten involved or failed to get involved in a number of discussions about numbers attached to OA.

On September 30, I posted “How many articles, how many journals?,” raising questions about statistics published in MDPI’s Sciforum asserting the number of OA journals and articles–numbers much lower than the ones I’ve derived by actual counting. I received email today regarding the issues I raised:

Thank you for passing this on. I think it’s quite difficult to pin down exactly how many papers are published, never mind adding in vagueries about the definition of ‘predatory’ or ‘questionable’ publishers. The data on Sciforum are taken from Crossref and, on, shows about 300,000 OA articles published in 2014. The difference may depend on correct deposition (including late or not at all), article types or publishers just not registered with Crossref. I think ball-park figures are about the closest we can get as things stand.

Well…yes and no. I think it’s highly likely that many smaller OA journals aren’t Crossref members or likely to become Crossref members: for little journals done out of a department’s back pocket, even $275/year plus $1/article is a not insignificant sum.

What bothers me here is not that the numbers are different, but that there seems to be no admission that a full manual survey is likely to produce more accurate numbers, not just a different “ball-park figure.” And that “pinning down” accurate numbers is aided by, you know, actually counting them. The Sciforum numbers are based on automated techniques: that’s presumably easy and fast, but that doesn’t make it likely to be right.

Then there’s the Shen/Björk article…which, as I might have expected, has been publicized all over the place, always with the twin effects of (a) making OA look bad and (b) providing further credibility to the one-man OA wrecking crew who shall go nameless here. The Retraction Watch article seems to be the only place there’s been much discussion of what may be wrong with the original article. Unfortunately, here is apparently the totality of what Björk chooses to say about my criticisms and others’:

“Our research has been carefully done using standard scientific techniques and has been peer reviewed by three substance editors and a statistical editor. We have no wish to engage in a possibly heated discussion within the OA community, particularly around the controversial subject of Beall’s list. Others are free to comment on our article and publish alternative results, we have explained our methods and reasoning quite carefully in the article itself and leave it there.”

Whew. No willingness to admit that their small sample could easily have resulted in estimates that are nearly three times too high. No willingness to admit that the author-nationality portion, based on fewer than 300 articles, is even more prone to sampling error. They used “standard scientific techniques” so the results must be accurate.

No, I’m not going around to all the places that have touted the Shen/Björk article to add comments. Not only is life too short, I don’t believe it will do much good.

The best I can do is transparent research with less statistical inference and more reliance on dealing with heterogeneity by full-scale testing, and hope that it will be useful. A hope that’s sometimes hard to keep going.

Meanwhile: I continue to believe that a whitelist approach–DOAJ‘s tougher standards–is far superior to a blacklist approach, especially given the historical record of blacklists.



Cites & Insights 15:10 (November 2015) available

October 5th, 2015

Cites & Insights 15:10 (November 2015) is now available for downloading at

This print-oriented two-column version is 38 pages long. If you plan to read the issue on a tablet or computer, you may prefer the 6″x9″ single column version, 74 pages long, which is available at

Unlike the book-excerpt October 2015 issue, there’s no advantage to the single-column version (other than its being single-column), and copyfitting has only been done on the two-column version. (As has been true for a couple of months, both versions do include links, bookmarks and visible bolding.)

This issue includes the following essays, stepping away from open access for a bit:

The Front: A Fair Use Trilogy   p. 1

A few notes about the rest of the issue–and a status report on The Gold OA Landscape 2011-2014.

Policy: Google Books: The Neverending Story?  pp. 1-18

Three years of updates on the seemingly endless Google Books story, which has now become almost entirely about fair use.

Policy: Catching Up on Fair Use  pp. 18-24

A handful of items regarding fair use that don’t hinge on Google Books or HathiTrust.

Intersections: Tracking the Elephant: Notes on HathiTrust  pp. 24-38

Pretty much what the title says, and again the main thrust appears to be fair use. (The elephant? Read the essay, including a little bit of Unicode.)


Careful reading and questionable extrapolation

October 2nd, 2015

On October 1, 2015 (yesterday, that is), I posted “The Gold OA Landscape 2011-2014: malware and some side notes,” including this paragraph:

Second, a sad note. An article–which I’d seen from two sources before publication–that starts by apparently assuming Beall’s lists are something other than junk, then bases an investigation on sampling from the lists, has appeared in a reputable OA journal and, of course, is being picked up all over the place…with Beall being quoted, naturally, thus making the situation worse. I was asked for comments by another reporter (haven’t seen whether the piece has appeared and whether I’m quoted), and the core of my comments was that it’s hard to build good research based on junk, and I regard Beall’s lists as junk, especially given his repeated condemnation of all OA–and, curiously, his apparent continuing belief that author-side charges, which in the Bealliverse automatically corrupt scholarship, only happen in OA (page charges are apparently mythical creatures in the Bealliverse). So, Beall gains even more credibility; challenging him becomes even more hopeless.

When I’d looked at the article, twice, I’d had lots of questions about the usefulness of extrapolating article volumes and, indeed, active-journal numbers from a rather small sampling of journals within an extremely heterogeneous space–but, glancing back at my own detailed analysis of journals in those lists (which, unlike the article, was a full survey, not a sampling), I was coming up with article volumes that, while lower, were somewhere within the same ballpark (although the number of active journals was less than half that estimated in the article). (The article is “‘Predatory’ open access: a longitudinal study of article volumes and market characteristics” by Cenyu Shen and Bo-Christer Björk; it’s just been published.)

Basically, the article extrapolated 8,000 active “predatory” journals publishing around 420,000 articles in 2014, based on a sampling of fewer than 700 journals. And, while I showed only 3,876 journals (I won’t call them “predatory” but they were in the junk lists) active at some point between 2011 and June 2014, I did come up with a total volume of 323,491 articles–so I was focusing my criticism of the article on the impossibility of basing good science on junk foundations.

Now, go back and note the italicized word two paragraphs above: “glancing.” Thanks to an email exchange with Lars Bjørnshauge at DOAJ, I went back and read my own article more carefully–that is, actually reading the text, not just glancing at the figures. Turns out 323,491 is the total volume of articles for 3.5 years (2011 through June 30, 2014). The annual total for 2013 was 115,698; the total for the first half of 2014 was 67,647, so it’s fair to extrapolate that the 2014 annual total would be under 150,000.

That’s a huge difference: not only is the article’s active-journal total more than twice as high as my own (non-extrapolated, based on a full survey) number, its article total is nearly three times as high as a straightforward annualization of mine. That shouldn’t be surprising: the article is based on extrapolations from a small number of journals in an extremely heterogeneous universe, and all the statistical formulae in the world don’t make that level of extrapolation reliable.
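The comparison is easy to check directly. A quick sketch using only figures quoted in this post, annualizing 2014 by doubling the first-half count, as the previous paragraph does:

```python
# Full-survey counts from my analysis, as quoted above.
articles_2013 = 115_698      # total articles, 2013
articles_h1_2014 = 67_647    # total articles, January-June 2014

# Naive annualization: double the first half (so, under 150,000).
est_2014 = articles_h1_2014 * 2

# Shen/Björk extrapolations, from a sample of fewer than 700 journals.
shen_bjork_articles_2014 = 420_000
shen_bjork_journals = 8_000

surveyed_journals = 3_876    # journals I found active at some point, 2011-mid-2014

print(est_2014)                                   # the annualized 2014 estimate
print(shen_bjork_articles_2014 / est_2014)        # roughly 3x my count
print(shen_bjork_journals / surveyed_journals)    # more than 2x my count
```

The doubling step is the rough-and-ready annualization described in the text, not a claim of precision; even so, the gap between the extrapolated totals and the full-survey counts is roughly a factor of three for articles and more than two for journals.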

Shen and Björk ignored my work, either because it’s not Properly Published or because they weren’t aware of it (although I’m pretty sure Björk knows of my work). They say “It would have taken a lot of effort to manually collect publication volumes” for all the journals on the list. That’s true: it was a lot of effort. Effort which I carried out. Effort which results in dramatically lower counts for the number of active journals and articles.

(As to the article’s “geographical spread of articles,” that’s based on a sample of 205 articles out of what they seem to think are about 420,000. But I didn’t look at authors so I won’t comment on this aspect.)

I should note that “active” journals includes those that published at least one article any time during the period. Since I did my analysis in late 2014 and cut off article data at June 30, 2014, it’s not surprising that the “active this year” count is lower for 2014 (3,014 journals) than for 2013 (3,282)–and I’ll agree with the article that recent growth in these journals has been aggressive: the count of active journals was 2,084 for 2012 and 1,450 for 2011.

I could speculate as to whether what I regard as seriously faulty extrapolations based on a junk foundation will get more or less publicity, citations, and credibility than counts based on a full survey–but carried out by an independent researcher using wholly transparent methodology and not published in a peer-reviewed journal. I know how I’d bet. I’d like to hope I’m wrong. (If not being peer-reviewed is a fatal problem, then a big issue in the study goes away: the junk lists are, of course, not at all peer reviewed.)



Mystery Collection Disc 45

October 2nd, 2015

The Manipulator, 1971, color. Yabo Yablonsky (dir & screenplay), Mickey Rooney, Luana Anders, Keenan Wynn. 1:25 [1:31]

No. No no no. It’s been almost six months since I watched one of these, and more like this could make me give up entirely. The plot, to the extent that I saw it: Mickey Rooney as a crazed old Hollywood person who carries on all parts of a movie-making set of conversations as he bumps into things in an old prop warehouse…but he’s got an actress tied up as well (kidnapped and being slowly starved), and I guess that their interactions are the heart of the movie. But after 20 minutes, I just couldn’t—and wish I’d given up after ten.

I didn’t see Keenan Wynn during the chunk I watched. Looking at the IMDB reviews, I see one that values it as an experimental film and, well, I guess you can make the worst shit look like roses if you try hard enough. Another praises it for Rooney’s “extraordinarily uninhibited performance,” but several say things like “endurance test for the viewer” and “nearly unwatchable.” I’m with them: not only no redeeming value, but really nasty. No rating.

Death in the Shadows (orig. De prooi), 1985, color. Vivian Peters (dir.), Maayke Bouten, Erik de Vries, Johan Leysen, Marlous Fluitsma. 1:37.

This one’s pretty good—with plenty of mystery, although the metamystery’s easy enough to resolve. (The metamystery: why is a 1985 color film available in a Mill Creek Entertainment set? The answer: it’s from the Netherlands, has no stars known in America, and wouldn’t have done well as a U.S. release.)

In brief: an almost-18-year-old young woman finds that her mother was killed—and that her mother didn’t have any children. The young woman now lives alone (and her boyfriend/lover is leaving for a big vacation as it’s the end of the school year), and—sometimes working with a police detective, sometimes ignoring his advice—wants to know what happened. In the process, she almost gets run down (which is what happened to her mother), her mother’s brother gets murdered, and she avoids death. We find out what happened.

Moody, frequently dark, fairly well done. Maayke Bouten is quite effective as the young woman, Valerie Jaspers, but this is apparently her only actual film credit (she was 21 at the time, so 18 isn’t much of a stretch: she also did one TV movie and appeared as herself on a TV show). Not fast-moving and no flashy special effects, but a pretty good film. $1.50.

Born to Win, 1971, color. Ivan Passer (dir.), George Segal, Paula Prentiss, Karen Black, Jay Fletcher, Hector Elizondo, Robert De Niro. 1:28 [1:24]

The disc sleeve identifies Robert De Niro as the star here, but this is very much a George Segal flick, with Karen Black and others—although De Niro’s in it (for some reason feeling to me like Billy Crystal playing Robert De Niro). The movie’s about a junkie (Segal) and…well, it’s about an hour and 24 minutes long.

Beyond that: poor editing, worse scriptwriting, continuity that deserves a “dis” in front of it. I got a hint in the first five minutes that this was going to have what you might call an “experimental” narrative arc, and so it was. Pretty dreary, all in all. Yes, it’s a low-budget indie with a great cast, but… (I will say: most IMDB reviews seem very positive. Good for them.) Charitably, for George Segal or Karen Black fans, maybe $0.75.

A Killing Affair, 1986, color. David Saperstein (dir.), Peter Weller, Kathy Baker, John Glover. 1:40.

A juicy chunk of Southern Gothic—set in West Virginia in 1943, starring Kathy Baker as the wife (or, really, property) of a mill foreman who’s ripping off the employees, openly sleeping with other women, and generally a piece of work. A stranger comes to…well, not so much town as the house across the lake from town where Baker lives (with her children on weekends—during the week, they stay in town with her brother, the preacher who clearly believes that women are to Obey their husbands).

Ah, but shortly before the stranger (Peter Weller) shows up, she discovers that her rotten husband is now hanging in the shed, very much dead. She makes some efforts to get help but isn’t quite willing to walk two miles to town (the boat’s gone), so… Anyway, the stranger shows up and Plot happens. Part of it: he admits to killing her husband, but claims her husband killed his wife and children and was about to shoot him. And there are all sorts of family secrets involved in her past. A pack of wild dogs also plays a role throughout the flick, especially in the climax.

Languid most of the time, with an unsurprising ending. Not terrible, not great; Weller’s a pretty convincing mentally unstable (but smooth!) killer, and Baker’s pretty much always good, and certainly is here. (How does a movie this recent and plausibly good wind up in a cheap collection? I have no idea.) I’ll give it $1.25.

The Gold OA Landscape 2011-2014: malware and some side notes

October 1st, 2015

First, a very brief status report. As of this morning, the book has sold five copies (four paperback, one ebook)–exactly the same numbers as a week ago, September 24, 2015. This is, how you say, not especially rapid progress toward the twin goals of making the data available and carrying forward the research into 2016. (Meanwhile, the October 2015 Cites & Insights has been downloaded at least 1,300 times so far–about 85% of those downloads being the more-readable single-column version of that issue, which excerpts The Gold OA Landscape 2011-2014.) If one out of every 20 downloads yielded a sale of the book, that would meet the data-availability goal and probably the next-year’s-research goal…

Second, a sad note. An article–which I’d seen from two sources before publication–that starts by apparently assuming Beall’s lists are something other than junk, then bases an investigation on sampling from those lists, has appeared in a reputable OA journal and, of course, is being picked up all over the place…with Beall being quoted, naturally, thus making the situation worse. I was asked for comments by another reporter (I haven’t seen whether the piece has appeared or whether I’m quoted). The core of my comments: it’s hard to build good research on a junk foundation, and I regard Beall’s lists as junk–especially given his repeated condemnation of all OA and, curiously, his apparent continuing belief that author-side charges, which in the Bealliverse automatically corrupt scholarship, only happen in OA (page charges are apparently mythical creatures in the Bealliverse). So Beall gains even more credibility, and challenging him becomes even more hopeless. [See this followup post]

Third, a somewhat better note: Cheryl LaGuardia has published “An Interview with Peter Suber” in her “Not Dead Yet” column at Library Journal. If you haven’t already read it, you should. A couple of key quotes (in my opinion):

Not all librarians are well-informed about OA, but as a class they’re much better informed than faculty.

First, scam OA journals do exist, just as scam subscription journals exist. On the other side, first-rate OA journals also exist, just as first-rate subscription journals also exist. There’s a full range of quality on both sides of the line. Authors often need help identifying the first-rate OA journals, or at least steering clear of the frauds, and librarians can help with that. The Directory of Open Access Journals (DOAJ) is a “white list” of trustworthy OA journals…

I used to think [“hybrid” OA] was good, since at least it gave publishers first-hand experience with the economics of fee-based OA journals. But I changed my mind about that years ago. Because these journals still have subscriptions, they have no incentive to make the OA option attractive. The economics are artificial. Moreover, as I mentioned, most hybrid OA journals double-dip, which is dishonest. But even when it’s honest, it’s still a small OA step that’s often mistaken for a big step.

Finally, the direct tie-in to the book…and to the second quote from the Suber interview.


The excerpted version omits the whole section on exclusions–DOAJ-listed journals that weren’t included in the study for a variety of reasons. In most cases, it’s not necessarily that these journals are scam journals (the term “predatory” has been rendered meaningless in this context) but that, for one reason or another, they either don’t fit my definition of a gold OA journal devoted to peer-reviewed articles or that I was simply unable to analyze them properly.

One unfortunate subcategory includes 65 journals, which is 65 more than should appear in this category: journals with malware issues. My best guess is that some of these will disappear from DOAJ and that others either try too hard for ad revenue (accepting ads that incorporate malware), have been badly designed, or use some convenient website add-in that just happens to carry malware. I don’t believe there’s any excuse for a journal to raise malware cautions–even if some of the defense tools I use might be overly cautious. (I added Malwarebytes after an OA journal infected my PC with a particularly nasty bit of malware, and at least two others attempted to load the same malware. It took me two days to get rid of the crap, and I have no interest in repeating that process. McAfee Site Advisor seems to be omnipresent in browsers and new computers, and since it’s now part of Intel I see no reason to distrust it.)

In any case: since it doesn’t look like OA publishers are rushing to buy the book and dig through it (I know, it’s early days yet), I’ll include that section here–the single case in which I actually list journal titles other than PLOS One (which I mention by name in the book because I excluded it from subject and segment discussions in order to avoid wrecking averages and distributions, since it is more than six times as large as any other OA journal).

Here’s the excerpt:

M: Malware

When attempting to reach these journals’ webpages, either Microsoft Office, McAfee Site Advisor, Windows Defender or Malwarebytes Anti-Malware threw up a caution screen indicating that the site had malware of some sort. (Actually, in one case the website got past all four—and showed an overlay that was a clear phishing attempt.)

In some few cases, the warning was a McAfee “yellow flag”; in most, it was either a McAfee red flag or Malwarebytes blocked the site.

Given that I encountered a serious virus with at least three different journals in a previous pass (getting rid of the virus is one reason I now run Malwarebytes as well as Windows Defender; note that I do not run McAfee’s general suite, but only the free Site Advisor that flags suspicious websites on the fly), I was not about to ignore the warnings and go look at the journals. I’d guess that, in some cases, the malware is in an ad on the journal page. In any case, it’s simply not acceptable for an OA journal to have malware or even possible malware.

I find it sad that there are 65 of these. They are not dominated by any one country of publication: 27 countries are represented among the 65 offending sites, although only a dozen have more than one each. The countries with more than three possible-malware journals include Germany and India (seven each), Brazil (six), Romania and the Russian Federation (five each), and the United States (four).

Malware Possibilities

While this report generally avoids naming individual journal titles or publishers, since it’s intended as an overall study, I think it’s worth making an exception for these 65 cases. These journals may have fixed their problems, but I’d approach with caution:

Acta Medica Transilvanica

Algoritmy, Metody i Sistemy Obrabotki Dannyh

Analele Universitatii din Oradea, Fascicula Biologie

Andhra Pradesh Journal of Psychological Medicine

Annals and Essences of Dentistry

Applied Mathematics in Engineering, Management and Technology

Avances en Ciencias e Ingeniería


Breviário de Filosofia Pública

Chinese Journal of Plant Ecology

Communications in Numerical Analysis

Confines de Relaciones Internacionales y Ciencia Política

Contemporary Materials

Data Envelopment Analysis and Decision Science



Economic Sociology

Education Research Frontier


European Journal of Environmental Sciences

Exatas Online

Filosofiâ i Kosmologiâ

Forum for Inter-American Research (Fiar)


Global Engineers and Technologists Review

Health Sciences and Disease

Impossibilia : Revista Internacional de Estudios Literarios

International Journal of Academic Research in Business and Social Sciences

International Journal of Ayurvedic Medicine

International Journal of Educational Research and Technology

International Journal of Information and Communication Technology Research

International Journal of Pharmaceutical Frontier Research

İşletme Araştırmaları Dergisi

Journal of Behavioral Science for Development

Journal of Community Nutrition & Health

Journal of Interpolation and Approximation in Scientific Computing

Journal of Management and Science

Journal of Nonlinear Analysis and Application

Journal of Numerical Mathematics and Stochastics

Journal of Soft Computing and Applications

Journal of Wetlands Environmental Management

Kritikos. Journal of postmodern cultural sound, text and image

Latin American Journal of Conservation

Mathematics Education Trends and Research

Nesne Psikoloji Dergisi

Networks and Neighbours


Potravinarstvo : Scientific Journal for Food Industry

Proceedings of the International Conference Nanomaterials : Applications and Properties

Psihologičeskaâ Nauka i Obrazovanie

Psikiyatride Guncel Yaklasimlar

Regionalʹnaâ Èkonomika i Upravlenie: Elektronnyi Nauchnyi Zhurnal

Revista Caribeña de Ciencias Sociales

Revista de Biologia Marina y Oceanografia

Revista de Educación en Biología

Revista de Engenharia e Tecnologia

Revista de Estudos AntiUtilitaristas e PosColoniais

Revista Pădurilor

Romanian Journal of Regional Science

Studii de gramatică contrastivă

Tecnoscienza : Italian Journal of Science & Technology Studies

Tekhnologiya i Konstruirovanie v Elektronnoi Apparature

Vestnik Volgogradskogo Gosudarstvennogo Universiteta. Seriâ 4. Istoriâ, Regionovedenie, Meždunarodnye Otnošeniâ