
Systematic Reviews for Health: Use of AI

A guide on how a Research Librarian can help you during a systematic review process

Use of AI in Systematic Reviews

Evidence synthesis is built on the principles of rigor, transparency and replicability. However, evidence syntheses are also time-consuming and expensive, and struggle to keep pace with the needs of funders and users. Artificial Intelligence (AI) has the potential to make the process faster and reduce costs, but it also brings challenges. At present, many AI tools could undermine the very principles evidence synthesis relies on, risking a flood of unreliable, biased and low-quality reviews.

In recent years there has been a rapid surge of new AI tools for systematic reviews and evidence syntheses, but they are emerging faster than they can be properly evaluated, leaving only a limited evidence base to guide their use.

A recent scoping review evaluated generative AI (genAI) tools across the different stages of a review and concluded that genAI is currently not promising for the searching step, but shows promise for the screening and data extraction steps.

Lieberum, JL, Toews, M, Metzendorf, MI, Heilmeyer, F, Siemens, W, Haverkamp, C, Böhringer, D, Meerpohl, JJ & Eisele-Metzger, A 2025, 'Large language models for conducting systematic reviews: on the rise, but not yet ready for use – a scoping review', Journal of Clinical Epidemiology, vol. 181, p. 111746.

 

Takeaway: AI should be used with human oversight – as a companion, not a replacement!

 

Responsible AI in Evidence Synthesis (RAISE)

Responsible AI in evidence SynthEsis (RAISE) is an initiative focused on promoting the responsible development, evaluation and use of AI tools in evidence synthesis, aiming to establish best-practice standards and build a stronger evidence base for tool selection and application.

The RAISE guidance comprises three documents:


RAISE 1

In summary, these are the recommendations for reviewers:

  • Remain ultimately responsible for the evidence synthesis
  • Report AI use in your evidence synthesis manuscript transparently
  • Ensure ethical, legal and regulatory standards are adhered to when using AI
  • Contribute to the ecosystem to help all roles continue to develop and grow

More details can be found in RAISE 1.


New AI Methods Group

The newly formed AI Methods Group is a collaboration between four leading evidence synthesis organisations: the Cochrane Collaboration, JBI, the Campbell Collaboration and the Collaboration for Environmental Evidence (CEE). The group focuses on AI and automation in evidence synthesis.


PRISMA and Reporting the Use of AI 

According to the PRISMA guidelines, authors should report which automation tools were used, how they were used, how they were trained, and details of each tool's performance and internal validation.

Need More Help?
Book a consultation with a Learning and Research Librarian or contact Librarians@utas.edu.au.