What is this, lurking at the bottom of the most recent ONTOX Insights newsletter? It is an experiment in a new format for methods submissions that focuses on providing templates for study protocols - making public all the ingredients of the secret sauce that goes into designing your study, so that other labs can more easily use or replicate your approach. Don't leave this information trapped in a PhD student's or postdoc's head - submit it to us and get it published! (If you have any questions, don't hesitate to get in touch with Paul Whaley, our Editor-in-Chief.) https://lnkd.in/g7-SrE6T
About us
Evidence-Based Toxicology is a gold open-access journal, created to support open science practices and the use of evidence-based methods in the toxicological and environmental health sciences. It is the official journal of the Evidence-Based Toxicology Collaboration.
- Website
- https://www.tandfonline.com/journals/tebt20
- Industry
- Research Services
- Company size
- 2-10 employees
- Headquarters
- Baltimore
- Type
- Partnership
- Founded
- 2023
- Specialties
- Academic publishing
Updates
This piece in Nature seems to address the symptoms of the problems with peer review rather than their cause. Some commentators are taking issue with the idea that LLMs even function in the way the author thinks they do. From our perspective, the way to address reviewer capacity issues is not to try to shortcut the review process but instead to accept that high-quality reviews are hard to write, and that we therefore need to (a) publish less, (b) only pass to reviewers papers that are ready for review, and (c) ring-fence time for and otherwise pay for peer review. https://lnkd.in/dnEt89HB
We have started experimenting with a monthly newsletter. The current format is: (1) a short think-piece on some aspect of publishing that affects us as a journal or informs our strategy; (2) highlighting a recently published paper; (3) highlighting a submitted preprint which we have evaluated, so you can look at the reviewer comments. Here is the first issue: https://lnkd.in/ewegWSry
If you are interested in how issues of trust and research integrity are addressed in our editorial policies, the Editor-in-Chief of Evidence-Based Toxicology Paul Whaley is part of this upcoming panel organised by Taylor & Francis Group.
I'll be chairing a Taylor & Francis Group panel discussion on the intersection of Open Research and Research Integrity (and what this means for "trust" in research) on Wednesday 5 March, please do sign up and join us! We have an excellent panel planned, including Paul Whaley, Allyson Lister, Matt Cannon and Coromoto Power Febres bringing their perspectives on reproducibility, transparency, policy, engagement and of course (!), the impact of AI. Everyone is welcome, join the conversation here: https://lnkd.in/evcp9wRu #openresearch #researchintegrity #openscience
At EBT we are very keen on reproducible methods. This is a really nice example from psychology of how one can go about testing and reporting the computational reproducibility of code. (This is not at all mandatory for submissions to us, but if an author wanted to do this we would be extremely supportive.)
The last paper I published (https://lnkd.in/e-xNawCv) includes a data and code availability statement: the raw data and analysis code are openly available in a public repository, a "reproducible" version of the project with all computational outputs is archived alongside it, and the project received an independent computational reproducibility certification confirming that the reported results can be regenerated from the shared materials. There is one thing from that process I thought might be worth sharing with my network, and it may be interesting to you as well: making my work computationally reproducible. What does that mean? Computational reproducibility means that if you provide your data and code exactly as they are, another researcher should be able to reproduce the exact results reported in your paper. I always assumed, "Of course! How could it be otherwise?" But actually, it's not that easy. The first time I tried to reproduce my own work, I struggled to do so myself. Later, when I was confident my work was computationally reproducible, I handed it to an independent code checker. It took them ages to figure out how to reproduce the results, and I had to walk them through it step by step. A well-documented project should allow someone else to easily figure out how to get from your data to the figures, tables, and analysis results you reported. Fortunately, there are many great initiatives to help us with this.
I used https://codecheck.org.uk, an initiative led by Stephen Eglen and Daniel Nüst. What's better than having a volunteer examine your code and, if they can successfully reproduce your results, issue an official citable certificate? You can even become a code checker yourself and contribute to this growing community. Another great initiative I learned about is https://www.reprohack.org/, run by Daniela Gawehns and team. It's especially helpful if you're just getting started and want to practice with a colleague to check whether your work is reproducible.
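The core check described above - rerun the analysis from the shared data and code, and confirm the outputs match byte for byte - can be sketched in a few lines. This is a minimal illustration only, not part of CODECHECK's or ReproHack's tooling: the `check_reproducibility` helper and the file names are hypothetical, and in a real check the `rerun` step would invoke the project's full analysis pipeline (e.g. an R or Python script) rather than a toy callable.

```python
import hashlib
from pathlib import Path


def sha256(path: Path) -> str:
    """Return the SHA-256 hex digest of a file's contents."""
    return hashlib.sha256(path.read_bytes()).hexdigest()


def check_reproducibility(rerun, outputs):
    """Hash the committed output files, call `rerun` to regenerate them,
    then report whether each file is byte-for-byte identical to before.

    `rerun` is any zero-argument callable that re-executes the analysis,
    e.g. lambda: subprocess.run(["Rscript", "analysis.R"], check=True).
    """
    before = {p: sha256(p) for p in outputs}  # digests of committed outputs
    rerun()                                   # regenerate outputs from data + code
    return {p.name: before[p] == sha256(p) for p in outputs}
```

A report value of `True` for every output file is the property an independent code checker is trying to establish; any `False` points at hidden state (unpinned package versions, unseeded randomness, manual steps) that the project's documentation has not captured.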
Just another minor update, but I am working through the authors' revisions to a manuscript about machine-supported term curation in a research database and am just absolutely loving it. What a useful resource this will be. (Probably not to a massive number of people, but at least it'll be out there in case it's needed instead of sitting in someone's head or on their hard drive.) https://lnkd.in/e_kjDT7m
Just a minor update, but we have had a revised manuscript submitted to EBT. As per our open review policies, you can already see the updated preprint in our Zenodo Community, including the response to reviewer comments. We will check this over and potentially invite reviewers to provide additional comments shortly. https://lnkd.in/e9NRMX6V
We have been vaguely threatening people with this for a couple of years, but we are *finally* going to launch a journal newsletter. You can sign up at the link below. It will include brief topical observations on how the publishing industry functions and affects journals like EBT, links to new publications (obviously!), and links to preprints submitted to us along with our evaluation reports on those submissions. https://lnkd.in/gprjDrDY
BIG NEWS! We are delighted to announce that we have been indexed in the Directory of Open Access Journals (https://doaj.org/). This is a big step forward for us as a new title. All we can say is thank you to everyone who has submitted their work to us, for their willingness to support a journal that is trying to do things a little differently, of course our reviewers (couldn't do it without you!), and the fantastic team at Taylor & Francis Group including Emma Coleman-Williams and Ian Challand who have been supporting us every step of the way. Onwards and upwards!
Our first publication of 2025, from Emily Senerth, MS, MPH, Giffe Johnson, PhD, Rebecca Morgan and colleagues. We commented on this when announcing its acceptance, but it is important work for showing convergence in #SystematicReview frameworks across different actors. So while a general consensus statement about what counts as a systematic review in environmental health and toxicology might not have been feasible 10 years ago, today it could be done. This might be helpful for resolving any outstanding disagreements (perceived or real) about ideal approaches to evidence synthesis in toxicology. https://lnkd.in/e9VP5Hbn