While systematic reviews (SRs) are often perceived as a “gold standard” for evidence synthesis in environmental health and toxicology, the methodological rigour with which they are currently conducted is unclear. The objectives of this study are (1) to provide up-to-date information about the methodological rigour of environmental health SRs and (2) to test the hypotheses that reference to a pre-published protocol, use of a reporting checklist, or publication in a journal with a higher impact factor is associated with increased methodological rigour of an SR. A purposive sample of 75 contemporary SRs was assessed for how many of 11 recommended SR practices each implemented. Information including search strategies, study appraisal tools, and certainty assessment methods was extracted to contextualise the results. The included SRs implemented a median of 6 out of 11 recommended practices. Use of a framework for assessing certainty in the evidence of an SR, reference to a pre-published protocol, and characterisation of research objectives as a complete Population-Exposure-Comparator-Outcome statement were the least common recommended practices. Reviews that referenced a pre-published protocol scored a mean of 7.77 out of 10, compared with 5.39 for those that did not. Neither use of a reporting checklist nor journal impact factor was significantly associated with increased methodological rigour of an SR. Our study shows that environmental health SRs omit a range of methodological components that are important for rigour. Improving this situation will require more complex and comprehensive interventions than the simple use of reporting standards.