The false promise of firearms examination validation studies: Lay controls, simplistic comparisons, and the failure to soundly measure misidentification rates
Correction(s) for this article
Correction to: The false promise of firearms examination validation studies: Lay controls, simplistic comparisons, and the failure to soundly measure misidentification rates. Journal of Forensic Sciences. 2024;69(5):1937. First published online: August 16, 2024.
Richard E. Gutierrez JD (Corresponding Author)
Forensic Science Division, Law Office of the Cook County Public Defender, Chicago, Illinois, USA
Academy Standards Board, Firearms and Toolmarks Consensus Body, Colorado Springs, Colorado, USA

Emily J. Prokesch JD
Discovery and Forensic Support Unit, New York State Defenders Association, Albany, New York, USA
Columbia School of Law, New York, New York, USA

Correspondence
Richard E. Gutierrez, Forensic Science Division, Law Office of the Cook County Public Defender, Chicago, Illinois, USA.
Email: [email protected]

[Correction added on May 20, 2024, after first online publication: The authors' affiliations have been updated in this version.]
Co-authors Richard E. Gutierrez and Emily J. Prokesch contributed equally to this study.
Abstract
Several studies have recently attempted to estimate practitioner accuracy when comparing fired ammunition. But whether this research has included sufficiently challenging comparisons dependent upon expertise for accurate conclusions regarding source remains largely unexplored in the literature. Control groups of lay people comprise one means of vetting this question, of assessing whether comparison samples were at least challenging enough to distinguish between experts and novices. This article therefore utilizes such a group, specifically 82 attorneys, as a post hoc control and juxtaposes their performance on a comparison set of cartridge case images from one commonly cited study (Duez et al. in J Forensic Sci. 2018;63:1069–1084) with that of the original participant pool of professionals. Despite lacking the kind of formalized training and experience common to the latter, our lay participants displayed an ability, generally, to distinguish between cartridge cases fired by the same versus different guns in the 327 comparisons they performed. And while their accuracy rates lagged substantially behind those of the original participant pool of professionals on same-source comparisons, their performance on different-source comparisons was essentially indistinguishable from that of trained examiners. This indicates that although the study we vetted may provide useful information about professional accuracy when performing same-source comparisons, it has little to offer in terms of measuring examiners' ability to distinguish between cartridge cases fired by different guns. If similar issues pervade other accuracy studies, then there is little reason to rely on the false-positive rates they have generated.
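For readers less familiar with how the error rates at issue are tallied, the sketch below (Python) illustrates the basic bookkeeping: same-source and different-source comparisons are scored separately, and false-negative and false-positive rates are computed over definitive conclusions, with inconclusive calls tracked on their own. This is a minimal, hypothetical illustration only; the counts are invented and are not data from this article or from Duez et al., and excluding inconclusives from the denominators is itself one of the contested conventions discussed in the sources cited below (e.g., refs. 8, 9, 20).

```python
# Hypothetical illustration: scoring a set of comparison conclusions.
# ground_truth is "same" or "different"; call is "identification",
# "elimination", or "inconclusive". Counts below are invented.

def summarize(conclusions):
    same = [call for truth, call in conclusions if truth == "same"]
    diff = [call for truth, call in conclusions if truth == "different"]

    # Definitive conclusions only (one common, but contested, convention).
    same_definitive = len(same) - same.count("inconclusive")
    diff_definitive = len(diff) - diff.count("inconclusive")

    return {
        "same_source_n": len(same),
        "different_source_n": len(diff),
        # False negative: elimination on a same-source pair.
        "false_negative_rate": same.count("elimination") / max(1, same_definitive),
        # False positive: identification on a different-source pair.
        "false_positive_rate": diff.count("identification") / max(1, diff_definitive),
        "inconclusive_rate_same": same.count("inconclusive") / max(1, len(same)),
        "inconclusive_rate_diff": diff.count("inconclusive") / max(1, len(diff)),
    }

# Made-up example: 100 same-source and 100 different-source comparisons.
demo = (
    [("same", "identification")] * 90
    + [("same", "inconclusive")] * 8
    + [("same", "elimination")] * 2
    + [("different", "elimination")] * 80
    + [("different", "inconclusive")] * 18
    + [("different", "identification")] * 2
)
print(summarize(demo))
```

Moving inconclusive calls into either the numerator or the denominator changes both rates, which is why the treatment of inconclusives recurs throughout the reference list.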
CONFLICT OF INTEREST STATEMENT
The authors have no competing financial interests to declare.
Supporting Information
Filename | Description
---|---
jfo15531-sup-0001-AppendixS1.pdf (PDF document, 890.7 KB) | Appendix S1
jfo15531-sup-0002-AppendixS2.xlsx (Excel 2007 spreadsheet, 17.1 KB) | Appendix S2
Please note: The publisher is not responsible for the content or functionality of any supporting information supplied by the authors. Any queries (other than missing content) should be directed to the corresponding author for the article.
REFERENCES
- 1 President's Council of Advisors on Science and Technology. Report to the President—Forensic science in criminal courts: ensuring scientific validity of feature-comparison methods. 2016. https://obamawhitehouse.archives.gov/sites/default/files/microsites/ostp/PCAST/pcast_forensic_science_report_final.pdf. Accessed 15 Mar 2024
- 2 AFTE. Theory of identification as it relates to toolmarks. AFTE J. 2011; 43(4): 287.
- 3Monson KL, Smith ED, Bajic SJ. Planning, design and logistics of a decision analysis study: the FBI/Ames study involving forensic firearms examiners. Forensic Sci Int Synerg. 2022; 4:100221. https://doi.org/10.1016/j.fsisyn.2022.100221
- 4Hamby JE. The history of firearm and toolmark identification. AFTE J. 1999; 31(3): 266–284.
- 5Garrett BL, Scurich N, Tucker E, Bloom H. Judging firearms evidence and the Rule 702 Amendments. Judicature. 2023; 107(2): 41–49.
- 6 Committee on Identifying the Needs of the Forensic Sciences Community, National Research Council. Strengthening forensic science in the United States: a path forward. Washington, DC: National Academies Press; 2009.
- 7 Human Factors Committee, Organization of Scientific Area Committees for Forensic Science. Human factors in validation and performance testing of forensic science. 2020. OSAC Technical Series 0004. https://www.nist.gov/system/files/documents/2023/10/26/OSACTechSeriesPub_HF%20in%20Validation%20and%20Performance%20Testing%20of%20Forensic%20Science_March2020.pdf. Accessed 14 Mar 2023
- 8Dror IE, Scurich N. (Mis)use of scientific measurements in forensic science. Forensic Sci Int Synerg. 2020; 2: 333–338. https://doi.org/10.1016/j.fsisyn.2020.08.006
- 9Hofmann H, Carriquiry A, Vanderplas S. Treatment of inconclusive results in firearms error rate studies. Law Probab Risk. 2020; 19: 317–364. https://doi.org/10.1093/lpr/mgab002
- 10Brundage DJ. The identification of consecutively rifled gun barrels. AFTE J. 1998; 30(3): 438–444.
- 11 Response to the declaration regarding firearms and toolmark error rates filed in Illinois v. Winfield. Washington, DC: Federal Bureau of Investigation; 2022.
- 12 United States Department of Justice Statement on the PCAST report. Forensic science in criminal courts: ensuring scientific validity of feature-comparison methods. Washington, DC: United States Department of Justice; 2021. https://www.justice.gov/olp/page/file/1352496/dl. Accessed 14 Mar 2023
- 13Baldwin D, Bajic S, Morris M, Zamzow D. A study of false-positive and false-negative error rates in cartridge case comparisons. Ames, IA: Ames Laboratory, U.S. Department of Energy; 2014. Report No. IS-5207. https://doi.org/10.21236/ADA611807
- 14Monson KL, Smith ED, Peters EM. Accuracy of comparison decisions by forensic firearms examiners. J Forensic Sci. 2022; 68(1): 86–100. https://doi.org/10.1111/1556-4029.15152
- 15Best BA, Gardner EA. An assessment of the foundational validity of firearms identification using ten consecutively button-rifled barrels. AFTE J. 2022; 54(1): 28–37.
- 16Keisler MA, Hartman S, Kilmon A, Oberg M, Templeton M. Isolated pairs research study. AFTE J. 2018; 50(1): 56–58.
- 17Guyll M, Madon S, Yang Y, Burd KA, Wells G. Validity of forensic cartridge-case comparisons. Proc Natl Acad Sci USA. 2023; 120(20):e2210428120. https://doi.org/10.1073/pnas.2210428120
- 18Agar J. The admissibility of firearms and toolmarks expert testimony in the shadow of PCAST. Bayl Law Rev. 2022; 74(1): 93–196.
- 19Faigman DL, Scurich N, Albright T. The field of firearms forensics is flawed. Sci Am. 2022. https://www.scientificamerican.com/article/the-field-of-firearms-forensics-is-flawed/. Accessed 15 Mar 2024
- 20Dorfman AH, Valliant R. Inconclusives, errors, and error rates in forensic firearms analysis: three statistical perspectives. Forensic Sci Int Synerg. 2022; 5:100273. https://doi.org/10.1016/j.fsisyn.2022.100273
- 21Koehler JJ. How trial judges should think about forensic science evidence. Judicature. 2018; 102(1): 28–38.
- 22Albright TD. How to make better forensic decisions. Proc Natl Acad Sci USA. 2022; 119(38):e2206567119. https://doi.org/10.1073/pnas.2206567119
- 23Khan K, Carriquiry A. Shining a light on forensic black-box studies. Stat Public Policy. 2023; 10(1):2216748. https://doi.org/10.1080/2330443X.2023.2216748
- 24Sinha M, Gutierrez RE. Signal detection theory fails to account for real-world consequences of inconclusive decisions. Law Probab Risk. 2022; 21(2): 131–135. https://doi.org/10.1093/lpr/mgad001
- 25Arkes HR, Koehler JJ. Inconclusives and error rates in forensic science: a signal detection theory approach. Law Probab Risk. 2022; 20: 153–168. https://doi.org/10.1093/lpr/mgac005
- 26Arkes HR, Koehler JJ. Inconclusive conclusions in forensic science: rejoinders to Scurich, Morrison, Sinha and Gutierrez. Law Probab Risk. 2022; 21: 175–177. https://doi.org/10.1093/lpr/mgad002
- 27 United States v. Tibbs. Superior Court of the District of Columbia. No. 2016 CF1 19431, 2019 D.C. Super. LEXIS 9. 2019.
- 28 New York v. Ross & A.M. Supreme Court of New York, County of Bronx: Ind No. 267/2018, 129 N.Y.S.3d 629. 2020.
- 29 United States v. Adams. Federal District Court of Oregon: 444 F.Supp.3d 1248. 2020.
- 30 Illinois v. Winfield. Circuit Court of Cook County: 15CR14066-01, 16CR13911-013, 17CR02660-01. 2023.
- 31 Abruquah v. Maryland. Supreme Court of Maryland: 483 Md. 637, 296 A.3d 961. 2023.
- 32 Oregon v. Moore. Circuit Court for the State of Oregon: No. 18CR77176. 2023.
- 33Weiss DJ, Shanteau J. Empirical assessment of expertise. Hum Factors. 2003; 45(1): 104–116.
- 34Ericsson KA, Charness N. Expert performance: its structure & acquisition. Am Psychol. 1994; 49(8): 725–747. https://doi.org/10.1037/0003-066X.49.8.725
- 35Garrett BL, Mitchell G. The proficiency of experts. Univ Pa Law Rev. 2018; 166: 901–960.
- 36Butler J. Validation overview. NIST DNA analysis webinar series: validation concepts and resources—part I. Gaithersburg, MD: National Institute of Standards and Technology; 2014. https://www.nist.gov/system/files/documents/forensics/01_ValidationWebinar-Butler-Aug2014.pdf. Accessed 15 Mar 2024
- 37Kellman PJ, Mnookin JL, Erlikhman G, Garrigan P, Ghose T, Mettler E, et al. Forensic comparison and matching of fingerprints: using quantitative image measures for estimating error rates through understanding and predicting difficulty. PLoS One. 2014; 9(5):e94617. https://doi.org/10.1371/journal.pone.0094617
- 38 National Assessment Governing Board, U.S. Department of Education. Science framework for the 2015 National Assessment of Educational Progress. 2014 https://www.nagb.gov/content/dam/nagb/en/documents/publications/frameworks/science/2015-science-framework.pdf. Accessed 15 Mar 2024
- 39Jones DS, Podolsky SH. The history and fate of the gold standard. Lancet. 2015; 385(9977): 1502–1503. https://doi.org/10.1016/S0140-6736(15)60742-5
- 40 Adequate and well-controlled studies, 21 CFR § 314.126. United States: Department of Health and Human Services; 2002.
- 41Whiteford H, Harris M, McKeon G, Baxter A, Pennell C, Barendregt J, et al. Estimating remission from untreated major depression: a systematic review and meta-analysis. Psychol Med. 2013; 43(8): 1569–1585. https://doi.org/10.1017/S0033291712001717
- 42Langenburg G. A critical analysis and study of the ACE-V process [Doctoral dissertation]. Lausanne, Switzerland: University of Lausanne; 2012.
- 43 National Institute of Forensic Science of the Australia New Zealand Policing Advisory Agency. Empirical study design in forensic science: a guideline to forensic fundamentals. 2019. https://www.anzpaa.org.au/nifs/publications/general. Accessed 15 Mar 2024
- 44Tangen JM, Thompson MB, McCarthy DJ. Identifying fingerprint expertise. Psychol Sci. 2011; 22(8): 995–997. https://doi.org/10.1177/0956797611414729
- 45Thompson MB, Tangen JM, McCarthy DJ. Expertise in fingerprint identification. J Forensic Sci. 2013; 58(6): 1519–1530. https://doi.org/10.1111/1556-4029.12203
- 46Thompson MB, Tangen JM, McCarthy DJ. Human matching performance of genuine crime scene latent fingerprints. Law Hum Behav. 2014; 38(1): 84–93. https://doi.org/10.1037/lhb0000051
- 47Thompson MB, Tangen JM. The nature of expertise in fingerprint matching: experts can do a lot with a little. PLoS One. 2014; 9(12):e114759. https://doi.org/10.1371/journal.pone.0114759
- 48Phillips PJ, Yates AN, Hu Y, Hahn CA, Noyes E, Jackson K, et al. Face recognition accuracy of forensic examiners, superrecognizers, and face recognition algorithms. Proc Natl Acad Sci USA. 2018; 115(24): 6171–6176. https://doi.org/10.1073/pnas.1721355115
- 49White D, Phillips PJ, Hahn CA, Hill M, O'Toole AJ. Perceptual expertise in forensic facial image comparison. Proc Biol Sci. 2015; 282:20151292. https://doi.org/10.1098/rspb.2015.1292
- 50Sita J, Found B, Rogers DK. Forensic handwriting examiners' expertise for signature comparison. J Forensic Sci. 2002; 47(5): 1117–1124. https://doi.org/10.1520/JFS15521J
- 51Max B, Cavise J, Gutierrez RE. Assessing latent print proficiency tests: lofty aims, straightforward samples, and the implications of non-expert performance. J Forensic Identif. 2019; 69: 281–298.
- 52Hamby JE, Brundage DJ, Thorpe JW. The identification of bullets fired from 10 consecutively rifled 9mm Ruger pistol barrels: a research project involving 507 participants from 20 countries. AFTE J. 2009; 41(2): 99–110.
- 53Duez P, Weller T, Brubaker M, Hockensmith RE, Lillien R. Development and validation of a virtual examination tool for firearm forensics. J Forensic Sci. 2018; 63(4): 1069–1084. https://doi.org/10.1111/1556-4029.13668
- 54 Girardot v. United States. District of Columbia Court of Appeals: 92 A.3d 1107. 2014.
- 55 Illinois v. King. Illinois Supreme Court: 2020 IL 123926. 2020.
- 56Brown H. Eight gates for expert witnesses. Houst Law Rev. 1999; 36(2): 743–882.
- 57Lawson TF. Can fingerprints lie?: Re-weighing fingerprint evidence in criminal jury trials. Am J Crim Law. 2003; 31(1): 1–66.
- 58 Rule 702. Federal rule of evidence; House of Representatives—The Committee on the Judiciary. Washington, DC: U.S. Government Printing Office; 2011.
- 59Luby AS, Kadane JB. Proficiency testing of fingerprint examiners with Bayesian Item Response Theory. Law Probab Risk. 2018; 17(2): 111–121. https://doi.org/10.1093/lpr/mgy009
- 60Mattijssen EJAT, Witteman CLM, Berger CEH, Brand NW, Stoel RD. Validity and reliability of forensic firearm examiners. Forensic Sci Int. 2020; 307:110112. https://doi.org/10.1016/j.forsciint.2019.110112
- 61Zheng X, Soons J, Thompson R, Singh S, Constantin C. NIST ballistics toolmark research database. J Res Natl Inst Stand Technol. 2020; 125:125004. https://doi.org/10.6028/jres.125.004
- 62Law EF, Morris KB. Evaluating firearm examiner conclusion variability using cartridge case reproductions. J Forensic Sci. 2021; 66(5): 1704–1720. https://doi.org/10.1111/1556-4029.14758
- 63 Transcript of Proceedings, Testimony of Todd Weller. United States v. Harris. United States District Court for the District of Columbia: No. 19-cr-358. 2020.
- 64Nichols RG. Declaration to Sharon Donovan, United States Attorney's Office. 2023.
- 65 Association of Firearms and Toolmarks Examiners. AFTE range of conclusions. Chicago: Association of Firearms and Toolmarks Examiners; 2020. https://afte.org/about-us/what-is-afte/afte-range-of-conclusions. Accessed 15 Mar 2024
- 66 National Association of Criminal Defense Lawyers – Cardozo Law. NACDL/Cardozo Law National Forensic College Application. 2022. https://www.nacdl.org/getattachment/ab77e39d-c306-4945-833c-e9b96ac17a86/private-attorney-nfc-2019.pdf. Accessed 15 Mar 2024
- 67 General requirements for informed consent, 45 CFR § 46.116. United States: Department of Health and Human Services; 2007.
- 68 Protection of human subjects, 28 CFR § 46. United States: Department of Justice; 2016.
- 69Oppenheimer DM, Meyvis T, Davidenko N. Instructional manipulation checks: detecting satisficing to increase statistical power. J Exp Soc Psychol. 2009; 45(4): 867–872. https://doi.org/10.1016/j.jesp.2009.03.009
- 70 Organization of Scientific Area Committees for Forensic Science, Firearms and Toolmarks Subcommittee. Response to the President's Council of Advisors on Science and Technology (PCAST) call for additional references regarding its report. https://www.nist.gov/organization-scientific-area-committees-forensic-science/firearms-toolmarks-subcommittee. Accessed 15 Mar 2024
- 71 Scientific Working Group for Firearms and Toolmarks. Systemic requirements/recommendations for the forensic firearm and toolmark laboratory. https://www.nist.gov/organization-scientific-area-committees-forensic-science/firearms-toolmarks-subcommittee. Accessed 15 Mar 2024
- 72 Scientific Working Group for Firearms and Toolmarks. Minimum qualifications for experienced firearm and toolmark examiners. 2006. https://www.nist.gov/organization-scientific-area-committees-forensic-science/firearms-toolmarks-subcommittee. Accessed 15 Mar 2024
- 73 ANSI National Accreditation Board. Directory of accredited labs. https://search.anab.org/. Accessed 28 Sept 2023
- 74 US v. Rhodes. United States District Court for the District of Oregon: No. 3:19-cr-00333-MC, 2023 U.S. Dist. LEXIS 7528. 2023.
- 75 US v. Hunt. United States Court of Appeals for the Tenth Circuit: 63 F.4th 1229. 2023.
- 76 Scientific Working Group for Firearms and Toolmarks. Guidelines for the standardization of comparison documentation. 2008. https://www.nist.gov/organization-scientific-area-committees-forensic-science/firearms-toolmarks-subcommittee. Accessed 15 Mar 2024
- 77Moran B. Photo documentation on toolmark examinations—an argument in support. AFTE J. 2003; 35(2): 174–189.
- 78Pauw-Vugts P, Walters A, Oren L, Pfoser L. FAID2009: proficiency test and workshop. AFTE J. 2013; 45(2): 115–127.
- 79Murrie D, Boccaccini MT, Guarnera LA, Rufino KA. Are forensic experts biased by the side that retained them? Psychol Sci. 2013; 24(10): 1889–1897. https://doi.org/10.1177/0956797613481812
- 80Smith A, Wells GL. Telling us less than what they know: expert inconclusive reports conceal exculpatory evidence in forensic cartridge-case comparisons. J Appl Res Mem Cogn. 2023; 13: 147–152. https://doi.org/10.1037/mac0000138
- 81Hare E, Hofmann H, Carriquiry A. Automatic matching of bullet land impressions. Ann Appl Stat. 2017; 11(4): 2332–2356. https://doi.org/10.1214/17-AOAS1080
- 82 United States v. Harris. United States District Court for the District of Columbia: 502 F. Supp. 3d 28. 2020.
- 83 United States v. Chavez. United States District Court for the Northern District of California: No. 15-CR-00285-LHK-1, 2021 U.S. Dist. LEXIS 237830. 2021.
- 84Knapp J, Garvin A. Consecutively manufactured 25 AUTO F.I.E. barrels—a validation study. AFTE 43rd Annual Training Seminar, Buffalo, NY. 2012. https://afte.org/store/product/training-seminar-dvd-afte-2012-buffalo-ny. Accessed 15 Mar 2024
- 85Eldridge H, de Donno M, Champod C. Testing the accuracy and reliability of palmar friction ridge comparisons—a black box study. Forensic Sci Int. 2021; 318:110457. https://doi.org/10.1016/j.forsciint.2020.110457
- 86 U.S. Food and Drug Administration. Statistical guidance on reporting results from studies evaluating diagnostic tests. Washington, DC: U.S. Department of Health and Human Services, Food and Drug Administration, Center for Devices and Radiological Health, Diagnostic Devices Branch, Division of Biostatistics, Office of Surveillance and Biometrics; 2007.
- 87 Clinical and Laboratory Standards Institute. EP12-Ed3, evaluation of qualitative, binary output examination performance. Malvern, PA: CLSI; 2023.
- 88Bunch S, Murphy D. A comprehensive validity study for the forensic examination of cartridge cases. AFTE J. 2003; 35(2): 201–203.
- 89Hamby JE, Norris S, Petraco ND. Evaluation of GLOCK 9 mm firing pin aperture shear mark individuality based on 1,632 different pistols by traditional pattern matching and IBIS pattern recognition. J Forensic Sci. 2016; 61(1): 170–176. https://doi.org/10.1111/1556-4029.12940
- 90Stroman A. Empirically determined frequency of error in cartridge case examinations using a declared double-blind format. AFTE J. 2014; 46(2): 157–175.
- 91Knowles L, Hockey D, Marshall J. The validation of 3D virtual comparison microscopy (VCM) in the comparison of expended cartridge cases. J Forensic Sci. 2022; 67(2): 516–523. https://doi.org/10.1111/1556-4029.14942
- 92Page M, Taylor J, Blenkin M. Uniqueness in the forensic identification sciences—fact or fiction? Forensic Sci Int. 2011; 206(1–3): 12–18. https://doi.org/10.1016/j.forsciint.2010.08.004
- 93Scurich N, Stern H. Commentary on: Monson KL, Smith ED, Peters EM. Accuracy of comparison decisions by forensic firearms examiners. J Forensic Sci. 2023; 68(3): 1093–1094. https://doi.org/10.1111/1556-4029.15258
- 94 Organization of Scientific Area Committees for Forensic Science, Firearms and Toolmarks Subcommittee. Response to the President's Council of Advisors on Science and Technology's (PCAST) request for information. 2015. https://www.nist.gov/organization-scientific-area-committees-forensic-science/firearms-toolmarks-subcommittee. Accessed 15 Mar 2024
- 95 Committee to Assess the Feasibility, Accuracy, and Technical Capability of a National Ballistics Database, National Research Council. Ballistic imaging. Washington, DC: National Academies Press; 2008.
- 96Nichols R. Subclass characteristics: from origin to evaluation. AFTE J. 2018; 50(2): 68–88.
- 97Rivera GC. Subclass characteristics in Smith & Wesson SW40VE sigma pistols. AFTE J. 2007; 39(3): 247–253.
- 98Bonfanti MS, de Kinder J. The influence of manufacturing processes on the identification of bullets and cartridge cases—a review of the literature. Sci Justice. 1999; 39(1): 3–10. https://doi.org/10.1016/s1355-0306(99)72008-3
- 99Biasotti A. A statistical study of the individual characteristics of fired bullets. J Forensic Sci. 1959; 4(1): 34–50.
- 100Miller J, McLean M. Criteria for identification of toolmarks. AFTE J. 1998; 30(1): 15–61.
- 101Miller J. Criteria for identification of toolmarks: part II—single land impression comparisons. AFTE J. 2000; 32(2): 116–131.
- 102Dror IE, Mnookin JL. The use of technology in human expert domains: challenges and risks arising from the use of automated fingerprint identification systems in forensic science. Law Probab Risk. 2010; 9(1): 47–67. https://doi.org/10.1093/lpr/mgp031
- 103Eldridge E, de Donno M, Girod M, Champod C. Coping with close non-matches in latent print comparison (re-)training. Washington, DC: Office of Justice Programs' National Criminal Justice Reference Service; 2022. Grant No. 2018-DU-BX-0227. https://www.ojp.gov/library/publications/coping-close-non-matches-latent-print-comparison-re-training. Accessed 15 Mar 2024
- 104Masson JJ. Confidence level variations in firearms identification through computerized technology. AFTE J. 1997; 29(1): 42–44.
- 105 Organization of Scientific Area Committees for Forensic Science, Firearms and Toolmarks Subcommittee. Research needs assessment form, assessment of examiners' toolmark categorization accuracy. 2021. https://www.nist.gov/organization-scientific-area-committees-forensic-science/osac-research-and-development-needs. Accessed 15 Mar 2024
- 106Gutierrez RE, Addyman C. Commentary on: Monson KL, Smith ED, Peters EM. Accuracy of comparison decisions by forensic firearms examiners. J Forensic Sci. 2023; 68(3): 1097–1101. https://doi.org/10.1111/1556-4029.15257
- 107Kam M, Fielding G, Conn R. Writer identification by professional document examiners. J Forensic Sci. 1997; 42(5): 778–786. https://doi.org/10.1520/JFS14207J
- 108Koehler JJ, Liu S. Fingerprint error rate on close non-matches. J Forensic Sci. 2021; 66(1): 129–134. https://doi.org/10.1111/1556-4029.14580
- 109Garrett BL, Cooper GS, Beckham Q. Forensic science in legal education. J Law Educ. 2022; 51(1): 1–12.
- 110Sanger RM. Forensics: educating the lawyers. J Leg Prof. 2019; 43(2): 221–250.
- 111Loudon-Brown M. Garbage in, garbage out: revising Strickland as applied to forensic science evidence. Ga State Univ Law Rev. 2018; 34(4): 893–914.
- 112Neufeld PJ. The (near) irrelevance of Daubert to criminal justice and some suggestions for reform. Am J Public Health. 2005; 95(Suppl 1): S107–S113. https://doi.org/10.2105/AJPH.2004.056333
- 113Despodova NM, Kukucka J, Hiley A. Can defense attorneys detect forensic confirmation bias? Effects on evidentiary judgments and trial strategies. Z Psychol. 2020; 228(3): 216–220. https://doi.org/10.1027/2151-2604/a000414