
Systematic Reviews for Health: 9. Pilot Search Strategy & Monitor Its Development

A guide on how a Research Librarian can help you during a systematic review process

Step 9. Pilot Search Strategy and Monitor Its Development

Developing a search that informs a systematic review is an iterative process.

Carry out the search as outlined in Step 7, entering each term from the Concept Table into the database and using the search history to combine the terms with AND and OR appropriately. You can easily spot mistakes by looking at the number of search results for each term, and you can identify which terms add value to the search. Keep an eye out for searches returning 0 results (the term adds no value or contains a mistake) and for searches returning a very high number of results (the term may be too broad). If you decide to remove a term, make a note of why.
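The Boolean logic behind combining searches can be pictured as set operations: OR within a concept is a union (a record needs to match only one term), while AND between concepts is an intersection (a record must match every concept). A minimal sketch, using made-up record IDs rather than real database results:

```python
# Illustrative only: each set holds hypothetical database record IDs
# returned by one search term. Real searches return thousands of records.
dementia = {101, 102, 103, 104}      # e.g. Dementia.ab,kf,ti.
alzheimer = {103, 105, 106}          # e.g. Alzheimer.ab,kf,ti.
music = {102, 103, 107}              # e.g. music*.ab,kf,ti.
music_therapy = {103, 108}           # e.g. Music Therapy/

# OR within a concept = set union.
concept_population = dementia | alzheimer
concept_intervention = music | music_therapy

# AND between concepts = set intersection.
final = concept_population & concept_intervention
print(sorted(final))  # records matching both concepts: [102, 103]
```

This is also why a term returning 0 results adds nothing to its OR group: the union is unchanged with or without it.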

It is important to review the search results regularly to determine whether they are relevant. Ask yourself whether the key articles you have already identified are being found by the search. If not, review these known articles for words in the title and abstract, the authors' keywords and, if available, controlled vocabulary terms. Revise your search strategy by adding any missing terms and re-run the search if necessary.
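Checking whether the known key articles appear in the results can be sketched as a set difference between the key articles' record IDs and the IDs retrieved by the pilot search. The PubMed IDs below are hypothetical, purely for illustration:

```python
# Illustrative only: hypothetical PubMed IDs.
retrieved = {"31234567", "29876543", "30111222"}  # IDs returned by the pilot search
key_articles = {"29876543", "27555000"}           # known key articles the search must find

# Any key article not in the retrieved set signals a gap in the strategy.
missing = key_articles - retrieved
if missing:
    print("Not retrieved - check title/abstract words and index terms:",
          sorted(missing))
```

Each ID in `missing` points to an article whose title, abstract and index terms should be mined for search terms you have not yet included.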

For more information on documenting your search, see the Documenting Search Strategies tab.

Evaluate the Results

After running your pilot search it is important that you evaluate the search results before translating the search for the other databases or commencing the screening process. You can do this by asking yourself these questions:

  • Did I find too many results?
  • Did I find too few results?
  • Are the search results mainly relevant?
  • Do the search results contain the key articles on the topic I already know about?

NOTE!  Developing your search is an iterative process. You usually have to modify your original search several times.

Did I find too many results?

There is no "good" or "right" number of search results. The number of search results will depend on how much research has been done on your topic.

If you find thousands of results, check whether the results are mainly relevant (see Relevant results?). If they are, you may like to make your question more specific by adding an extra concept.

Did I find too few results?

There is no "good" or "right" number of search results. The number of search results will depend on how much research has been done on your topic.

You may like to ask yourself these questions:

  • Did I include all alternative terms in my search?
  • Did I include all relevant controlled vocabulary terms? Is there a broader term available?
  • Was I too restrictive by using phrase searching?
  • Can I remove a concept from my search?
  • Can I remove a search filter?
  • Did I make any spelling mistakes?

Are the results I found relevant?

Every search will find some irrelevant results; that is inevitable. However, if you find far too many irrelevant results, consider the following:

  • Does each concept only include synonyms and similar terms or did I use a term that is too broad?
  • Did I get the ORs and ANDs right?
  • Is there one term that consistently brings up irrelevant results? Can I remove that term and include a more specific one instead?
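One common way to quantify "mainly relevant" is precision: the proportion of retrieved records that are relevant, estimated by screening a random sample of the results. A minimal sketch with hypothetical screening numbers:

```python
# Illustrative only: hypothetical screening numbers.
# Precision = relevant records / records screened, estimated from
# a random sample of the search results.
sample_screened = 200   # records screened from the result set
sample_relevant = 18    # of those, judged relevant to the review question

precision = sample_relevant / sample_screened
print(f"Estimated precision: {precision:.1%}")
```

A very low estimated precision suggests one of the problems listed above, such as an overly broad term or a misplaced OR.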

Do the search results contain the known key articles?

A good search should pick up the known key articles on your topic.

If not, check the following:

  • In my key articles, what terms did the author use in the title, abstract and author keywords that describe my concepts? What controlled vocabulary terms did the databases assign to my key articles that are relevant to my concepts? Have I used these terms in my search?
  • Is the journal in which the key article was published indexed in the database I am searching?

Automation tool SearchRefinery

The SearchRefinery tool from the Systematic Review Accelerator can help refine the terms used in a search. You upload a PubMed search and the PubMed IDs of your key articles, and SearchRefinery visualises the search, displaying the number of PubMed results, the relevant citations still to find, and the relevant citations already found. You can add and remove keywords and easily see the impact of these adjustments on the search.

Note!  You need to report on the use of automation tools as per the PRISMA guidelines.

Other Resources

This video from the Medical Library at Yale University introduces techniques for validating, verifying and revising your searches.

You may like to use the PRESS (Peer Review of Electronic Literature Search Strategies) checklist to self-critique your search strategies. The aim of the PRESS 2015 Guideline Statement is to help guide and improve the peer review of search strategies.

McGowan, J, Sampson, M, Salzwedel, DM, Cogo, E, Foerster, V & Lefebvre, C 2016, 'PRESS peer review of electronic search strategies: 2015 guideline statement', Journal of Clinical Epidemiology, vol. 75, no. Supplement C, pp. 40-46. (Table 1, p42)

The University of Newcastle has developed a checklist for reviewing search strategies based on PRESS.

Example

This is the search history for our example search in Medline via Ovid (result numbers for search done on 16 October 2020), following Step 7. Each term is searched for separately and the individual searches are then combined with OR and AND using the Search History.

#   Searches   Results
1 Dementia.ab,kf,ti. 119329
2 Alzheimer.ab,kf,ti. 30558
3 "Huntington*".ab,kf,ti. 19155
4 Kluver.ab,kf,ti. 552
5 Lewy.ab,kf,ti. 10352
6 exp Dementia/ 167832
7 1 or 2 or 3 or 4 or 5 or 6 236562
8 (Animal adj3 therap*3).ab,kf,ti. 1773
9 (Animal adj3 activit*).ab,kf,ti. 3639
10 (Animal adj3 intervention*).ab,kf,ti. 892
11 (Pet adj3 therap*3).ab,kf,ti. 1324
12 (Dog adj3 therap*3).ab,kf,ti. 331
13 (Canine adj3 therap*3).ab,kf,ti. 499
14 Aquarium.ab,kf,ti. 2023
15 Animal Assisted Therapy/ 430
16 Pets/ 2583
17 Dogs/ 327033
18 Cats/ 135457
19 Birds/ 38348
20 Bonding, Human-Pet/ 1883
21 Animals, Domestic/ 15577
22 8 or 9 or 10 or 11 or 12 or 13 or 14 or 15 or 16 or 17 or 18 or 19 or 20 or 21 499538 
23 "music*".ab,kf,ti. 23820
24 singing.ab,kf,ti. 3489
25 sing.ab,kf,ti. 1366
26 "Auditory stimulat*".ab,kf,ti. 1987
27 Music/ 14161
28 Music Therapy/ 3582
29 Acoustic Stimulation/ 43715
30 Singing/ 897
31 23 or 24 or 25 or 26 or 27 or 28 or 29 or 30 73975
32 Aggression.ab,kf,ti. 31826
33 Neuropsychiatric.ab,kf,ti. 34382
34 Apathy inventory.ab,kf,ti. 37
35 Cornell scale.ab,kf,ti. 392
36 Cohen Mansfield.ab,kf,ti. 360
37 BEHAVE-AD.ab,kf,ti. 166
38 CERAD-BRSD.ab,kf,ti. 5
39 behavio?r.ab,kf,ti.     922413
40 exp Aggression/ 38628
41 exp Personality Inventory/ 36368
42 Psychomotor Agitation/ 5189
43 32 or 33 or 34 or 35 or 36 or 37 or 38 or 39 or 40 or 41 or 42 1019967
44 7 and 22 and 31 and 43 7

 

All of the search terms identified in Step 3 and Step 4 return some results and add value. The final search combining all four concepts returned only seven results. It is easier to scan these articles for an RCT study design than to add a methodological search filter for RCTs as another concept; it is also best to use the fewest concepts possible in a search when conducting a systematic review.

This is the search history for our example search in Medline via PubMed (result numbers for search done on 19 May 2017), following Step 7. Each term is searched for separately and the individual searches are then combined with OR and AND using the History and Query box.

Note!   PubMed's search history displays the searches in reverse chronological order (latest search on top).

Search   Query   Results
#48 #7 AND #24 AND #34 AND #47 10
#47 #35 OR #36 OR #37 OR #38 OR #39 OR #40 OR #41 OR #42 OR #43 OR #44 OR #45 OR #46 1,469,606
#46 Psychomotor agitation [mh] 6,642
#45 Personality inventory [mh] 36,769
#44 Aggression [mh] 40,623
#43 Behaviour* [tiab] 309,598
#42 Behavior* [tiab] 1,084,895
#41 CERAD-BRSD [tiab] 5
#40 BEHAVE-AD [tiab] 166
#39 Cohen Mansfield [tiab] 372
#38 Cornell scale [tiab] 396
#37 Apathy inventory [tiab] 34
#36 Neuropsychiatric [tiab] 35,245
#35 Aggression [tiab] 32,027
#34 #25 OR #26 OR #27 OR #28 OR #29 OR #30 OR #31 OR #32 OR #33 74,990
#33 Singing [mh] 1,050
#32 Acoustic Stimulation [mh] 44,925
#31 Music Therapy [mh] 3,867
#30 Music [mh] 15,005
#29 Auditory stimulat* [tiab] 1,956
#28 Sing [tiab] 1,400
#27 Singing [tiab] 3,552
#26 Music* [tiab] 23,586
#25 Music therapy [tiab] 2,692
#24 #8 OR #9 OR #10 OR #11 OR #12 OR #13 OR #14 OR #15 OR #16 OR #17 OR #18 OR #19 OR #20 OR #21 OR #22 OR #23 501,702
#23 Animals, Domestic [mh:noexp] 15,819
#22 Bonding, Human-Pet [mh] 1,955
#21 Birds [mh:noexp] 39,601
#20 Cats [mh] 137,266
#19 Dogs [mh] 332,958
#18 Pets [mh] 2,922
#17 Animal Assisted Therapy [mh:noexp] 489
#16 Aquarium [tiab] 2,029
#15 Canine-assisted therapy [tiab] 20
#14 Dog-assisted therapy [tiab] 33
#13 Dog therapy [tiab] 14
#12 Pet therapy [tiab] 164
#11 Animal therapy [tiab] 70
#10 Animal-assisted intervention* [tiab] 215
#9 Animal-assisted activit* [tiab] 81
#8 Animal-assisted therapy [tiab] 368
#7 #1 OR #2 OR #3 OR #4 OR #5 OR #6 287,217
#6 Dementia [mh] 179,610
#5 Lewy [tiab] 10,599
#4 Kluver [tiab] 552
#3 Huntington* [tiab] 19,152
#2 Alzheimer [tiab] 162,646
#1 Dementia [tiab] 121,679

 

All of the search terms identified in Step 3 and Step 4 return some results and add value. The final search combining all four concepts returned only ten results. It is easier to scan these articles for an RCT study design than to add a methodological search filter for RCTs as another concept; it is also best to use the fewest concepts possible in a search when conducting a systematic review.

Need More Help?
Book a consultation with a Learning and Research Librarian or contact Librarians@utas.edu.au.