Garbage in, Garbage Out? How to Avoid Citing from Questionable Sources in Systematic Literature Reviews
by Rüdiger Hahn
Systematic literature reviews critically assess a body of literature, summarizing the field's key findings and identifying limitations (e.g., Denyer & Tranfield, 2009; Kunisch et al., 2023; Siddaway et al., 2019). In business and management sciences, these reviews typically prioritize studies published in peer-reviewed journals, which are regarded as the gold standard in academic publishing. However, recent debates and critiques have highlighted concerns around predatory or near-predatory journals and paper mills, which compromise the standards of peer review processes.
Predatory journals, which often charge author fees, fail to provide the rigorous editorial and review services characteristic of reputable publications. Despite claims of strict peer review, investigations reveal these processes to be ineffective (Hahn, 2024a, b). Furthermore, paper mills generate and sell fraudulent research papers, enabling authors to submit these works to journals and thus compromising scientific integrity (McKie, 2024). There is also initial evidence that some predatory or near-predatory journals have close ties to paper mills (Albanika, 2023). The proliferation of such dubious papers poses clear risks.
In disciplines like medicine or psychology, guidelines for conducting literature reviews often stress the importance of assessing the empirical quality of each study (e.g., Fink, 2019; Siddaway et al., 2019). This paper-by-paper quality assessment is advisable as it recognizes the potential for valuable contributions even within suspect journals, and conversely, the possibility of flawed studies in reputable ones. However, this meticulous approach has limitations: it demands considerable methodological knowledge from the reviewer and tends to be more practical for reviews focused on specific methodologies, common in fields with dominant research paradigms. Business research, with its methodological diversity—from conceptual to mixed-method studies—presents a particular challenge in applying uniform quality criteria. The wide range of methodologies, including but not limited to experimental designs, surveys, structural equation modeling, and regression analysis, complicates the establishment of overarching quality standards.
Consequently, perceived journal quality can serve as an indirect measure of study quality, guiding the inclusion or exclusion of papers in a review. This method relies on predefined lists of journals, sidestepping the need for a detailed quality assessment of each study. It rests on the premise that editors of reputable journals aim to minimize Type 2 errors (accepting low-quality papers) to protect their journal's reputation. In my experience, high-quality journals do guard against such errors diligently, which is reflected in their high rejection rates; the flip side, and a source of frustration for many authors, is an increased chance of Type 1 errors (rejecting good papers). While not foolproof, screening by journal quality significantly reduces the risk of incorporating questionable research and offers a practical and efficient option, particularly for novice researchers.
Regrettably, dubious journals sometimes find their way into esteemed databases and are only occasionally expunged (Hanson, 2023; Retraction Watch, 2023), necessitating a manual selection process for inclusion in literature reviews. This practice, which I regularly employ in my own work (see Schätzlein et al., 2023; Schlütter et al., 2023), involves two main strategies: employing blacklists to exclude certain journals or publishers, and utilizing whitelists to specifically include others.
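As a rough illustration of this manual screening step, the two strategies can be combined into a simple three-way partition: exclude blacklisted journals, keep whitelisted ones, and set everything else aside for manual inspection. The sketch below is a minimal illustration only; all journal names are hypothetical placeholders, and real screening would also have to handle name variants, ISSNs, and publisher-level entries.

```python
# Sketch: screen a candidate bibliography against journal black- and whitelists.
# All journal names below are hypothetical placeholders.

def normalize(name: str) -> str:
    """Lower-case and strip punctuation so minor name variants still match."""
    return "".join(ch for ch in name.lower() if ch.isalnum() or ch.isspace()).strip()

def screen(candidates, blacklist, whitelist):
    """Partition candidate journals into (kept, excluded, needs manual check)."""
    black = {normalize(j) for j in blacklist}
    white = {normalize(j) for j in whitelist}
    kept, excluded, manual = [], [], []
    for journal in candidates:
        key = normalize(journal)
        if key in black:
            excluded.append(journal)   # explicitly flagged as questionable
        elif key in white:
            kept.append(journal)       # appears on a vetted rating list
        else:
            manual.append(journal)     # on neither list: inspect by hand
    return kept, excluded, manual

# Hypothetical example data
candidates = ["Journal of Management Studies", "Predatory Review Letters", "New Niche Journal"]
whitelist = ["Journal of Management Studies"]
blacklist = ["Predatory Review Letters"]
kept, excluded, manual = screen(candidates, blacklist, whitelist)
print(kept, excluded, manual)
```

The "manual" bucket is deliberate: a journal missing from both lists is not evidence of quality either way, so it should trigger exactly the kind of case-by-case judgment the article describes.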
Regarding blacklists, a notable resource, Beall's List, originated from Jeffrey Beall's efforts to catalog publishers engaged in predatory practices. Despite its valuable contribution, the list's methodology and lack of transparency have faced criticism (Kimotho, 2019). Presently, it receives updates from anonymous volunteers at https://beallslist.net/ . Similarly, https://predatoryreports.org offers another exhaustive blacklist, maintained by anonymous researchers, dedicated to identifying disreputable journals and publishers. The criteria for inclusion on these lists remain somewhat unclear, raising questions about their selection processes. However, from a personal and subjective standpoint, I find both lists to be thorough and well-maintained, offering crucial insights into the landscape of predatory publishing.
In contrast to blacklists, whitelists adopt an inclusive strategy by highlighting reputable journals. Among such tools, the Scimago Journal Rank (SJR) is notable for evaluating journals' scientific impact through citation metrics and the prestige of citing sources (Guerrero-Bote & Moya-Anegón, 2012). The SJR organizes journals into categories based on their field and ranks them in quartiles from Q1 (indicating the highest influence) to Q4 (indicating the least). This stratification across various disciplines supports a comprehensive and interdisciplinary review process. Nevertheless, the inclusion of journals with controversial reputations in higher quartiles, such as "Sustainability" published by MDPI (Crosetto, 2021; Oviedo-García, 2021), highlights the critical need for careful journal selection. To minimize the risk of endorsing questionable publications, focusing on Q1 journals provides a more reliable benchmark. Additionally, integrating whitelisting with blacklisting strategies may offer a balanced approach to journal evaluation.
Another option is whitelisting through disciplinary journal ratings. These ratings offer nuanced insights but vary in scope and detail. For instance, the FT50 list encapsulates the top 50 journals across business research domains, underscoring the pinnacle of academic publishing in the field (Ormans, 2016). While these journals represent the top of their respective areas, the list's breadth (spanning accounting to strategic management) combined with its small size means such exclusivity inevitably omits other highly esteemed publications. Consequently, I advocate for a more inclusive approach to whitelisting, cautioning against overly narrow selections that might bypass significant and valuable research contributions.
My preferred approach to refining a list of journals thus involves utilizing broader rating systems. The Academic Journal Guide, issued by the Chartered Association of Business Schools, offers an esteemed international rating for business publications, categorizing over 1,500 journals on a scale from "1" (modest standard research) to "4" and "4*" (highest quality research) (Chartered Association of Business Schools, 2021). Similarly, the Journal Quality List by the Australian Business Deans Council (ABDC) encompasses over 2,500 journals (ABDC, 2023). Anne-Wil Harzing's Journal Quality List provides a comprehensive overview of various such ratings in the field of business research, making it a valuable starting point for conducting your own assessment (Harzing, 2023).
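When several rating schemes are available, one pragmatic inclusion rule is to keep a journal if any scheme places it above a chosen cut-off. The sketch below illustrates this with two schemes modeled loosely on the AJG and ABDC grade scales; the specific cut-offs ("3"/"A" and above) and all journal ratings shown are my own hypothetical choices, not prescribed by either list.

```python
# Sketch: combine two rating schemes into a single inclusion decision.
# Cut-offs and ratings below are hypothetical illustrations.

AJG_KEEP = {"3", "4", "4*"}   # assumed cut-off: AJG-style grade 3 or above
ABDC_KEEP = {"A", "A*"}       # assumed cut-off: ABDC-style grade A or above

def include(journal, ajg_ratings, abdc_ratings):
    """Keep a journal if either rating scheme places it above its cut-off."""
    return (ajg_ratings.get(journal) in AJG_KEEP
            or abdc_ratings.get(journal) in ABDC_KEEP)

# Hypothetical rating tables (journal -> grade)
ajg = {"Journal A": "4*", "Journal B": "2"}
abdc = {"Journal B": "A", "Journal C": "B"}

for j in ["Journal A", "Journal B", "Journal C"]:
    print(j, include(j, ajg, abdc))
```

Using an "either list" rule reflects the article's preference for inclusive whitelisting: a journal well rated in one scheme is not dropped merely because another scheme rates it lower or omits it.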
In conclusion, the evaluation of journal quality encompasses a spectrum of strategies, each with its unique advantages and challenges. As scholars navigate this terrain, the choice of approach—be it the use of whitelists, blacklists, or a blend of metrics—should be tailored to the specific context and objectives of their research. This nuanced decision-making process underscores the importance of critical assessment in upholding the caliber of academic inquiry.
References:
ABDC (2023). Australian Business Deans Council 2022 Journal Quality List Review Final Report, March 15th 2023. https://abdc.edu.au/wp-content/uploads/2023/03/ABDC-2022-Journal-Quality-List-Review-Report-150323.pdf
Albanika, A. (2023). Publication and collaboration anomalies in academic papers originating from a paper mill: Evidence from a Russia-based paper mill. Learned Publishing, 36(4), 689-702. https://doi.org/10.1002/leap.1574
Chartered Association of Business Schools (2021). Academic Journal Guide 2021 - Methodology. https://charteredabs.org/wp-content/uploads/2021/06/Academic_Journal_Guide_2021-Methodology.pdf
Crosetto, P. (2021). Is MDPI a predatory publisher? Last updated April 20th 2021. https://paolocrosetto.wordpress.com/author/milanphd/
Denyer, D., & Tranfield, D. (2009). Producing a systematic review. In D. A. Buchanan & A. Bryman (Eds.), The Sage Handbook of Organizational Research Methods. Sage Publications Ltd, 671-689.
Fink, A. (2019). Conducting Research Literature Reviews - From the Internet to Paper. 5th ed. Sage Publishing. ISBN 9781544318479
Guerrero-Bote, V. P., & Moya-Anegón, F. (2012). A further step forward in measuring journals' scientific prestige: The SJR2 indicator. Journal of Informetrics, 6, 674-688.
Hahn, R. (2024a). https://www.dhirubhai.net/posts/rudigerhahn_peerreview-rigor-relevance-activity-7139901675749699586-DXCz/
Hahn, R. (2024b). https://www.dhirubhai.net/posts/rudigerhahn_rigor-relevance-mpdi-activity-7137349865180663808-XOfd/
Hanson, M. A. (2023). MDPI mega-journal delisted by Clarivate / Web of Science, March 23rd 2023. https://mahansonresearch.weebly.com/blog/mdpi-mega-journal-delisted-by-clarivate-web-of-science
Harzing, A. W. (2023). Journal Quality List, 70th ed. https://harzing.com/download/jql70a-title.pdf
Kimotho, S. G. (2019). The storm around Beall's List. African Research Review, 13(2), Serial No. 54. https://doi.org/10.4314/afrrev.v13i2.1
Kunisch, S., Denyer, D., Bartunek, J. M., Menz, M., & Cardinal, L. B. (2023). Review research as scientific inquiry. Organizational Research Methods, 26(1), 3-45. https://doi.org/10.1177/10944281221127292
McKie, R. (2024). 'The situation has become appalling': Fake scientific papers push research credibility to crisis point. The Guardian, Feb 3rd 2024. https://www.theguardian.com/science/2024/feb/03/the-situation-has-become-appalling-fake-scientific-papers-push-research-credibility-to-crisis-point
Ormans, L. (2016). 50 Journals used in FT Research Rank. https://www.ft.com/content/3405a512-5cbb-11e1-8f1f-00144feabdc0
Oviedo-García, M. A. (2021). Journal citation reports and the definition of a predatory journal: The case of the Multidisciplinary Digital Publishing Institute (MDPI). Research Evaluation, 30(3), 405-419. https://doi.org/10.1093/reseval/rvab020
Retraction Watch (2023). Nearly 20 Hindawi journals delisted from leading index amid concerns of papermill activity, March 21st 2023. https://retractionwatch.com/2023/03/21/nearly-20-hindawi-journals-delisted-from-leading-index-amid-concerns-of-papermill-activity/
Schätzlein, L., Schlütter, D., & Hahn, R. (2023). Managing the External Financing Constraints of Social Enterprises: A Systematic Review of a Diversified Research Landscape. International Journal of Management Reviews, 25(1), 176-199. https://doi.org/10.1111/ijmr.12310
Schlütter, D., Schätzlein, L., Hahn, R., & Waldner, C. (2023). Missing the Impact in Impact Investing Research - A Systematic Review and Critical Reflection of the Literature. Journal of Management Studies, OnlineFirst. https://doi.org/10.1111/joms.12978
Siddaway, A. P., Wood, A. M., & Hedges, L. V. (2019). How to do a systematic review: A best practice guide for conducting and reporting narrative reviews, meta-analyses, and meta-syntheses. Annual Review of Psychology, 70, 747-770. https://doi.org/10.1146/annurev-psych-010418-102803