
Mentorship Lessons from Growing a Developing Country Journal

Experiences in building a quality open access journal and using it to mentor emerging academics.


There is a large space between high-impact journals and low-quality journals, the latter often referred to as “predatory” following publication of Beall’s list.1 That so many low-quality journals exist is testimony to the fact that many developing-country authors who need to publish because of career or degree requirements find the bar to publishing in quality journals too high.4 Wide variability across disciplines in what counts as a credible journal or conference paper makes it difficult for government regulators to enforce quality standards. Publish-or-perish pressures, difficult enough in developed countries, are particularly taxing on academics who lack experienced mentors or role models within their own society.

South African Computer Journal (SACJ) was established before online publishing existed, but it is nonetheless subject to pressures similar to those on other developing-country journals because it switched from paper to online publishing relatively recently (2010), when predatory journals were already an issue. When I took over as editor-in-chief (EiC), SACJ was moribund, so I was in a position to rethink its positioning. I learned lessons that could be useful to others trying to build a credible journal against two types of competition: established journals and predatory journals. One measure of journal quality, in my view, is the extent to which it practices mentorship; at the very least, this should take the form of useful feedback from reviews.

While SACJ was “established,” it was not widely known because the subscription model worked against the visibility of a journal serving a small academic community. My aim was to publish quality papers with a developmental agenda: authors, particularly novices without local mentors, should learn from reviewer and editor feedback. My own history provided helpful perspective. I joined academia in 1981 in a department with little experience. I learned the value of mentorship through sabbaticals at top U.S. universities. Being at one of South Africa’s most research-intensive universities, the University of the Witwatersrand, was no help. Potential mentors from other fields did not understand computer science, nor did we get much help from better-established academics at other universities. As a journal editor, I could at least partially fill that gap. Exactly how was not obvious; having since developed some ideas, I aim to pass them on here.

The South African Department of Higher Education and Training pays a substantial subsidy to universities to incentivize publication in accredited journals. Journals can be accredited for various reasons, including being on a major index such as Web of Science (WoS) or Scopus, being in the SciELO South Africa collection (an open access journal aggregator), and being in the Directory of Open Access Journals (DOAJ). Other developing countries face their own publish-or-perish pressures,4 hence the large number of predatory journals.

In what follows, I explain how I positioned SACJ to fit a niche: a quality, regionally focused journal with a reasonable bar for acceptance and a developmental agenda. I briefly review the history of SACJ, then look at the wider space. I describe the changes I made to SACJ during my 2012–2022 term as EiC and assess the potential for similar journals to emerge to bridge the gap between predatory and prestigious journals. While the issues I raise apply to poorly resourced countries, similar challenges can occur in poorly resourced contexts in wealthier countries, as my early-career experience shows.

SACJ Starting Point

I summarize here the history of the journal up to when I took over in mid-2012, along with the key points of my turnaround plan. South African Computer Journal began as Quæstiones Informaticæ (QI) in 1979. Until 2010, the journal was available by subscription. Its publisher is the South African Institute of Computer Scientists and Information Technologists (SAICSIT).

The July 2011 issue was the first published open access, using Open Journal Systems (OJS). Much of the pre-production workload was converting accepted papers to our published format. We permitted Microsoft Word and LaTeX submissions; Microsoft Word documents were by far the more onerous to convert. Appointing a production editor relieved the EiC of this workload. To pay the production editor, we instituted a publication charge of R6000 (South African rand, ZAR; when I took over the journal in June 2012, one U.S. dollar bought about R8.50, and in April 2024 approximately R19). This amount is approximately 5% of the government subsidy for a journal publication, so South African universities are generally happy to pay it. SACJ does not invoice before a paper is accepted, and the fee is waived if the author cannot fund it.
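
A rough back-of-envelope check, using only the figures above and assuming the R6000 fee and the approximately 5% ratio held over the period, puts these numbers in perspective:

\[
\text{subsidy per paper} \approx \frac{\text{R}6000}{0.05} = \text{R}120{,}000,
\qquad
\frac{\text{R}6000}{8.50} \approx \text{USD}\,706 \ \text{(2012)},
\qquad
\frac{\text{R}6000}{19} \approx \text{USD}\,316 \ \text{(2024)}.
\]

In dollar terms, then, the charge has fallen by more than half since 2012, while remaining a small fraction of the subsidy a university receives for the same paper.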

Insisting on submissions using LaTeX would have added another obstacle, contrary to our developmental agenda. Final editing in Microsoft Word format is, in my experience, more work than conversion to LaTeX; for example, proper referencing and more complex layout elements such as tables are easier to handle in LaTeX.

Our biggest challenge was predatory journals. They make publishing easy, sometimes with ludicrously short turnaround times and inexpensive publication charges, yet they damage the reputation of open access (OA). Unlike predatory OA journals, we predate Internet publishing; our track record goes back decades, and we have a commitment to quality.

The Predatory Landscape

Jeffrey Beall listed attributes of predatory publishers,1 including soliciting high fees only after publishing. He also reported academics being listed on editorial boards without their consent. The predatory field has since moved on. Fees have become much lower, although some things have not changed. Predatory journal websites still make overblown claims in lofty, self-promoting prose, often in poor English. I also receive email messages soliciting a paper that cite one of my own publications as an example of what they are looking for, albeit one in a completely unrelated field.

Published papers are often of poor quality, with flaws such as a literature review that is little more than a table of reference citations. Authors accustomed to this “standard” who submit to other journals are likely to be summarily rejected. The targets of this sort of scam are beginner academics and graduate students without experienced mentors, a problem compounded in lower-resourced environments.

Credibility and Visibility

In South Africa in recent years, some universities, chasing international rankings, have started encouraging publication only in journals indexed at least on Scopus, and preferably on WoS. At the same time, the government’s criteria for accreditation have broadened; the most recent update (in 2021) added anything listed by DOAJ.3

My approach was to add as many options for improved visibility and credibility as possible, including:

  • special issues

    • extended conference papers

    • thematic issues

  • registering a DOI for each published article

  • listing on DOAJ

  • listing on Scopus

  • inclusion on Scientific Electronic Library Online South Africa (SciELO SA)

Any of the last three is sufficient for South African government accreditation.

DOAJ has some credibility checks, so it is useful to join for more than just visibility. Scopus sets a higher bar for acceptance, including checks on citation counts of existing content, a checklist of items such as regularity of publishing and having an ISSN, and review by an expert panel. WoS, which we aspire to join, has 28 criteria, including four related to citation counts and 24 quality measures. Admission to Scopus in 2016 signals that we are at a reasonable level; being invited to join SciELO SA in 2016 indicates credibility in the broader South African community.

Registering DOIs aids visibility, as some search engines crawl the DOI database. DOIs also give us the freedom to move the journal site, since a DOI resolves to wherever the article currently lives. Requiring DOIs wherever possible in references makes it easier to check the correctness of reference lists. Using DOIs is not a differentiator from predatory journals, however.

Competing in the middle ground as a quality journal with a developmental role is not easy. Authors who aspire to publish in top-ranked journals will go there, and authors who need a quick publication to check a box want a guaranteed turnaround time; that can only be achieved with either a high rejection rate or giving up on quality.

A Developmental Agenda

It is necessary to be clear on the goal of maintaining quality, to ensure differentiation from the cottage industry of predatory publishing. Another problem with fitting into this middle ground is the rubbery definition of a journal versus conference proceedings. In computing disciplines, some say conferences can be as high in impact as a good journal,5 but opinions vary.2

Persuading funders to treat a discipline differently is a losing battle, so it was tempting to publish conference papers in SACJ. However, to do so would be to abandon our developmental agenda. Part of the journal-publishing experience is reading reviews and revising for resubmission, which is seldom possible within conference timelines.

The one area where a quick response is possible without compromising quality is rejecting papers that are clearly far from publishable, particularly those typical of predatory journals, with obvious flaws such as poor language, plagiarism, an inadequate literature review, and no clear contribution. As EiC, I shielded my team from dealing with such papers but also helped authors by providing quick feedback, including advising them to read my own journal for examples of acceptable work. A desk reject of an obviously unsuitable paper is relatively little work, and I tried to do these within a week of submission.

Another approach we used was extended conference papers, with at least 30% substantial new material and a new title. Thematic special issues with guest editors also increased submissions and spread the experience of editing, another form of mentorship. Reviewing is also a form of mentorship, as editors can give reviewers feedback.

When I was editor (June 2012–June 2022), special-issue papers were 24% of the total. We put more effort into encouraging special issues following relatively lean years. To maintain consistent standards, we gave final say on acceptance to the regular team. Problems I have seen include guest editors accepting a paper with only negative reviews and stalling when the first round of invitations does not yield responses. On the positive side, a good guest editor is a candidate to invite onto the editorial team. From struggling to attract enough submissions when I took over in 2012, we have in recent years turned away proposals for large numbers of papers, particularly those that presuppose a high acceptance rate.

SACJ has shown that it is possible to carve out a niche that promotes quality while remaining accessible to developing authors. Because of the administrative burden, I would drop publication charges if we could find a sponsor. However, OA is the way to go for any journal planning to increase its impact. The key is to differentiate clearly from predatory journals.

By the end of my term as EiC, my key discoveries about the role of a journal such as SACJ in mentorship were:

  • give quick feedback to authors whose papers are nowhere near acceptable,

  • build the author pool through special issues and extended conference papers, and

  • build the pool of experience by mentoring guest editors.

How can others draw on these lessons? I would like to see a developmental approach to adding open access journals to the ACM Digital Library. The International Conference Proceeding Series (ICPS) provides a model, with conferences independently run and screened for quality. Early in my SACJ EiC term, I approached ACM about including the journal in the Digital Library, and the response to adding an OA journal was hostile. Possibly this has changed, as ACM now at least partially embraces open access. What is missing is a mentoring model for editors that could build up those who aspire to the sort of quality standard that predatory OA journals lack.

    References

    1. Beall, J. Predatory publishers are corrupting open access. Nature 489, 7415 (2012); 10.1038/489179a
    2. Freyne, J. et al. Relative status of journal and conference publications in computer science. Commun. ACM 53, 11 (Nov. 2010); 10.1145/1839676.1839701
    3. Piron, F. et al. Trying to say ‘no’ to rankings and metrics: Case studies from Francophone West Africa, South Africa, Latin America and the Netherlands. In Socially Responsible Higher Education. Brill, Boston (2021), Chapter 22; 10.1163/9789004459076_023
    4. Sharma, A. et al. The Indian struggle against predatory journals: The importance of quality control. J. Pharmaceutical Negative Results 14, 1 (2023), 556–560.
    5. Vrettas, G. and Sanderson, M. Conferences versus journals in computer science. J. Association for Information Science and Technology 66, 12 (2015); 10.1002/asi.23349
