Digital morphology analyzers in hematology: ICSH review and recommendations
Funding information
This study was financially supported by the ICSH. The ICSH is a not-for-profit organization sponsored by unrestricted educational grants from its corporate and affiliate members, who are listed in Data S1.
Abstract
Introduction
Morphological assessment of the blood smear has been performed by conventional manual microscopy for many decades. Recently, rapid progress in digital imaging and information technology has led to the development of automated methods of digital morphological analysis of blood smears.
Methods
A panel of experts in laboratory hematology reviewed the literature on the use of digital imaging and other strategies for the morphological analysis of blood smears. The strengths and weaknesses of digital imaging were determined, and recommendations on improvement were proposed.
Results
By preclassifying cells using artificial intelligence algorithms, digital image analysis automates the blood smear review process and enables faster slide reviews. Digital image analyzers also allow remote networked laboratories to transfer images rapidly to a central laboratory for review, and facilitate a variety of essential work functions in laboratory hematology such as consultations, digital image archival, libraries, quality assurance, competency assessment, education, and training. Different instruments from several manufacturers are available, but there is a lack of standardization of staining methods, optical magnifications, color and display characteristics, hardware, software, and file formats.
Conclusion
In order to realize the full potential of Digital Morphology Hematology Analyzers, preanalytic, analytic, and postanalytic parameters should be standardized. Manufacturers of new instruments should focus on improving the accuracy of cell preclassification and the automated recognition and classification of pathological cell types. Cutoffs for grading morphological abnormalities should depend on clinical significance. With all current devices, a skilled morphologist remains essential for cell reclassification and diagnostic interpretation of the blood smear.
1 INTRODUCTION
Morphological analysis of the Romanowsky-stained peripheral blood smear is an essential element of hematological disease diagnosis.1 The morphological appearance of leukocytes, red blood cells, and platelets can be crucial in identifying certain hematological disease states.
Traditionally, morphological analysis of the blood smear has been performed by manual light microscopy. Although this method is the gold standard, it has the disadvantages of being labor-intensive, requiring continuous training of personnel, and being subject to relatively large interobserver variability.2 Moreover, there are space constraints for the storage of conventional glass slides, and it is difficult to obtain secondary consultations with the classical manual microscopy method. Thus, there is an increasing demand for digital microscopy systems capable of performing morphological analysis of blood smears in an automated manner.3-7
Initial developments in this field date back to the 1960s. The CELLSCAN,8 the Hematrak,9 and the Cydac Scanning Microscope System (Cydac)10 were among the first automated morphological analysis systems introduced. Other systems followed, such as the Leukocyte Automatic Recognition Computer (LARC; Corning Medical, Raleigh, North Carolina), the Coulter Diff3 and Diff4 (Coulter S-Plus WBC histogram, Coulter Electronics), and the ADC500 (Abbott Laboratories). However, these early systems did not provide significant improvements in workflow, as they were relatively slow and displayed limited degrees of automation.11
In the 1980s, the field of laboratory hematology diagnostics of blood samples advanced significantly with the introduction of flow-based cell counters. These cell counters were capable of subclassifying leukocytes and detecting abnormalities in leukocytes, red blood cells, and platelets using instrument flags.12 Although these platforms progressed through many improvements and have become an integral part of laboratory hematology diagnostics today, the need for morphological review of blood smears still remains.
More recently, more advanced automated digital microscopy systems have been developed and introduced to the laboratory hematology market: (a) CellaVision®DM96, CellaVision®DM1200, CellaVision®DM9600 (CellaVision AB); (b) Sysmex DI-60 (Sysmex Corp., Kobe, Japan; based on the CellaVision DM1200 platform); (c) Vision Hema (West Medica); (d) EasyCell (Medica Corporation); (e) Nextslide (Nextslide Imaging LLC); (f) Cobas m511 (Roche Diagnostics); and (g) HemaCAM (Fraunhofer Institute for Integrated Circuits IIS). To date, most peer-reviewed studies have been performed using the CellaVision and Sysmex systems.
Most of these systems make use of a digital camera coupled to a computer system. Digital images of individual cells are then used as input material for a computer-aided classification based on blood cell image analysis parameters such as geometric, color, and texture features.13
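As a simple illustration of this type of feature-based preclassification, the sketch below trains a small neural network on precomputed cell features. The feature names, file names, and network size are hypothetical assumptions chosen for illustration; they do not describe any vendor's actual implementation.

```python
# Minimal illustrative sketch (not vendor code): classifying leukocytes from
# precomputed geometric, color, and texture features with a small neural network.
# The feature set and training data files are hypothetical placeholders.
import numpy as np
from sklearn.neural_network import MLPClassifier
from sklearn.preprocessing import StandardScaler
from sklearn.pipeline import make_pipeline

# Each row: one segmented cell; columns: example features such as
# [area_um2, perimeter_um, circularity, mean_hue, mean_saturation, texture_entropy]
X_train = np.load("cell_features.npy")   # hypothetical feature matrix
y_train = np.load("cell_labels.npy")     # e.g. "neutrophil", "lymphocyte", ...

model = make_pipeline(
    StandardScaler(),                     # scale features to comparable ranges
    MLPClassifier(hidden_layer_sizes=(64, 32), max_iter=500, random_state=0),
)
model.fit(X_train, y_train)

# Preclassification of a new cell; an operator would still review and, if needed,
# reclassify the suggested label.
new_cell = np.array([[78.0, 33.5, 0.87, 0.62, 0.41, 4.2]])
print(model.predict(new_cell)[0], model.predict_proba(new_cell).max())
```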
2 DETERMINATION OF THE STRENGTHS AND WEAKNESSES OF AVAILABLE PLATFORMS
The PubMed search engine was used for a literature review to search for “Digital Imaging” and “CellaVision,” concentrating on peer-reviewed articles. The retrieved articles were then used to identify additional publications in the field. We also discussed new instruments and developments with thought leaders in the field at scientific meetings. Based on this information, we established a database of peer-reviewed articles and digital imaging instruments on the market.
2.1 DM96, DM1200, and DM9600 (CellaVision AB, Lund, Sweden)
The DM systems (CellaVision AB), further referred to as CellaVision, are automated digital image analyzers that locate cells in a blood smear, capture images of the cells, preclassify the cells using image analysis software, and then display the images on a computer screen. The analyzer scans a portion of a microscopy slide and automatically identifies an appropriate analysis area (monolayer) in which it locates and captures ×50 oil (red blood cells, RBC) or ×100 oil (white blood cells, WBC) images of cells on a blood smear.4 The software preclassifies WBCs to give an automated differential count of peripheral blood smears or body fluid cytospin slides and precharacterizes RBC by morphological abnormalities. Cell type recognition is based on an artificial neural network, which analyzes the digital images and preclassifies the cells. The analyzer requires a skilled morphologist to review all cells and to reclassify all unidentified cells before the results can be released.
The basic underlying technology of the Diffmaster Octavia (introduced in 2001), the predecessor to the CellaVision series of hematology instruments, has been summarized by Hagner and Wilson.14 Since its introduction in 2003, the platform has been extensively evaluated in the peer-reviewed literature.5, 6
2.1.1 WBC Differential
Kratz et al documented that preclassification of WBCs was most accurate for segmented neutrophils (92.5% correctly classified), lymphocytes (96.4%), and monocytes (81.4%); the preclassification assignments were similarly reproducible for basophils, but not for eosinophils (63.2%).4 Technologists agreed with 82% of the instrument's classifications. Briggs et al6 similarly reported poor detection of immature granulocytes, eosinophils, and basophils, but found that differentials were performed faster with the CellaVision than with manual microscopy and that the overall preclassification accuracy was 89.2%; the authors noted that the subjective classification of immature granulocytes into an earlier or later maturity class is also a human failing. Cornet et al15 and Billard et al16 highlighted a failure to identify lymphoblasts. Rollins-Raval et al found that, depending on the patient population, the accuracy of the CellaVision was between 87.2% and 95.4%, and noted that the instrument had difficulty in reliably identifying immature granulocytes, plasma cells, and blasts.17 Eilertsen et al18 used generalized linear mixed models to determine the effects of pre- and reclassification as well as intersmear and between-technologist variation in blast counts. They found a significant difference between blast counts at preclassification vs reclassification (P = 0.009), but no significant differences between duplicate smears (P = 0.621) or between technologists (P = 0.542). The preclassification feature alone was sufficient to verify the absence of blasts in samples flagged by the Sysmex XE-5000 autoanalyzer. In contrast, Park et al19 found that detection of basophils and band neutrophils was poor, but that blast detection showed high sensitivity and specificity. In their timing and performance study of leukopenic samples, the authors established that by examining an additional 90 cells per slide, at a cost of approximately 40 additional seconds per slide, they could achieve a more reliable differential count.
- The authors acknowledged that their selection of abnormal blood smears was not random; smears were selected to include examples of white cell abnormalities.
- The technologists at each participating laboratory only analyzed the slides from their own hospital sites. As each laboratory only had 2 to 3 technologists, there was a high probability of the same technologist manually reading the slide and also reclassifying the DM96 results.
- The study, similar to many other reports, focused on the reclassified cell results rather than the automated preclassified results. Although this is how the DM96 will be used in clinical practice, such studies ultimately assess the ability of technologists to identify cells on a computer screen rather than test the ability of the DM96 algorithms to classify cells. Therefore, the study may be unable to test the possibility of full automation of peripheral blood smear analysis.
Stouten et al analyzed 6,945 blood smears with approximately 1.4 million cells. They concluded that the CellaVision has excellent performance for the five “normal” WBC classes, as well as for blasts.21 In another study, Riedl et al placed CellaVision instruments in each of four different clinical laboratories in the Netherlands.22 They were able to demonstrate that the instruments yielded reproducible preclassification results for neutrophils, lymphocytes, monocytes, and eosinophils. Blast cells were also identified correctly. Basophil classifications showed a higher variation due to the low number of these cells present in the samples.
Amundsen et al23 reported that the neutrophil count of the CellaVision compared very well with a flow cytometric standard, even at very low neutrophil counts. Buoro et al24 compared monocyte enumeration by an automated hematology analyzer, flow cytometry, optical microscopy, and a digital morphology analyzer. They found that optical microscopy or digital image analysis did not achieve a satisfactory level of accuracy for enumerating monocytes, which are usually very low in peripheral blood.
Vaughan et al25 performed an evaluation of CellaVision in Africa. Of the 149 samples reviewed, 53% of samples were positive for HIV. Pre- and postclassification results from the DM96 were compared to manual differential counts. Pre- and postclassification accuracies were similar to the results from developed countries. Reclassification was needed for 16% of cells. Misclassifications were high for eosinophils (32%), blasts (34%), and basophils (94%).
2.1.2 Red blood cell morphology and the CellaVision ARBCA
The CellaVision Advanced Red Blood Cell Application (ARBCA) was developed using an artificial neural network. It considers a large number of features, including size, roundness, size and shape of central pallor, and distribution of notches around the border for the classification of RBC.26 It groups RBCs into 21 morphological categories based on color, size, shape, and inclusions, including anisocytosis, micro- and macrocytosis, polychromasia and hypochromasia, poikilocytosis, target cells, schistocytes, helmet cells, sickle cells, spherocytes, elliptocytes, ovalocytes, teardrop cells, stomatocytes, acanthocytes, and echinocytes. Inclusions such as Howell-Jolly bodies, Pappenheimer bodies, and parasites are also classified.
In a recent paper, Huisjes et al27 analyzed samples from 90 patients with hereditary hemolytic anemia and from 32 controls with the CellaVision ARBCA to improve and standardize the diagnosis of these disorders. They found the DM96 to be a useful screening tool for the detection and quantification of distinct morphological changes specific for various forms of hereditary hemolytic anemia.
Hervent et al compared the use of the ARBCA for schistocyte counts before and after reclassification of the cells with the reference manual microscopic method according to the ICSH criteria.28 They reported a very high sensitivity but rather poor specificity, requiring frequent reclassification by the operator. About two-thirds of RBCs initially classified as schistocytes by the software had to be reclassified to other categories (false positive cells). About one-fourth of all RBCs ultimately classified as schistocytes by the operator were reassigned from other categories (false-negative cells). A high variability between observers demonstrated that the main challenge in schistocyte analysis remains the morphological definition of a schistocyte.
Egele et al compared the sensitivity and specificity of pre- and postclassification of teardrop cells by the ARBCA in 10 normal blood smears and in 46 blood smears from patients in which teardrop cells were present. They found that the ARBCA had an increased sensitivity for teardrop cells compared with manual microscopy, leading to a 20%-30% overestimation of teardrop cells.29
Egele et al further evaluated the ARBCA on 16 peripheral blood smears in which schistocytes were present, as well as on 10 normal slides. Preclassification of the normal blood smears showed less than 1.4% schistocytes; for patients with diseases associated with schistocytes, 15 out of 16 had schistocyte counts of 3.2% or higher.21, 30
It should be noted that direct comparisons of the results of different studies of the ARBCA are difficult because the RBC cutoff values can be adjusted by the user.
RBC overlap is sometimes falsely reported by the CellaVision as macrocytosis.5, 6, 16 A low sensitivity of 33% for RBC agglutination has been reported by Horn et al.31 They also reported high specificities but variable sensitivities for RBC morphologic abnormalities using the DM96. Sensitivity for sickle cells and stomatocytes reached 100%.
Criel et al optimized the cutoff points between negativity and positivity of the software.32 This allowed them to achieve sensitivities exceeding 80% for “critical” RBC categories (target cells, tear drop cells, spherocytes, sickle cells, and parasites), while specificities exceeded 80% for the other RBC morphological categories. Results of within-run, between-run, and between-observer variabilities were all clinically acceptable. However, evaluation of the images by an experienced observer remained necessary.
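The sketch below illustrates the general idea of such cutoff optimization: for a given RBC category, it selects the highest positivity cutoff that still meets a sensitivity target against a manual reference and reports the resulting specificity. The data, the 80% target, and the selection rule are illustrative assumptions, not the procedure published by Criel et al.

```python
# Illustrative sketch only: choosing a positivity cutoff for an RBC category
# (eg % schistocytes per slide) so that sensitivity against a manual reference
# meets a target, while maximizing the cutoff (and hence specificity).
import numpy as np

def choose_cutoff(percentages, reference_positive, target_sensitivity=0.80):
    """percentages: preclassified % of the RBC category per slide.
    reference_positive: boolean truth per slide from manual microscopy."""
    percentages = np.asarray(percentages, dtype=float)
    reference_positive = np.asarray(reference_positive, dtype=bool)
    best = None
    for cutoff in np.unique(percentages):          # candidate cutoffs, ascending
        called_positive = percentages >= cutoff
        tp = np.sum(called_positive & reference_positive)
        fn = np.sum(~called_positive & reference_positive)
        tn = np.sum(~called_positive & ~reference_positive)
        fp = np.sum(called_positive & ~reference_positive)
        sens = tp / (tp + fn) if (tp + fn) else 0.0
        spec = tn / (tn + fp) if (tn + fp) else 0.0
        if sens >= target_sensitivity:             # keep highest qualifying cutoff
            best = (cutoff, sens, spec)
    return best

# Hypothetical data: 8 slides with preclassified schistocyte % and manual truth
cutoff, sens, spec = choose_cutoff(
    [0.2, 0.4, 0.9, 1.3, 2.5, 3.1, 0.1, 4.0],
    [False, False, True, True, True, True, False, True],
)
print(f"cutoff={cutoff}% sensitivity={sens:.2f} specificity={spec:.2f}")
```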
Florin et al33 evaluated the ARBCA for screening and follow-up of malaria. They found that the sensitivity of the application was too low to be used as a screening method. However, the application may have a role in the follow-up of parasitemia.
There are case reports of the detection of cellular inclusions such as bacteria34 and ganciclovir-induced Howell-Jolly body-like inclusions by the DM96.35
Racsa et al36 retrospectively analyzed 124 Plasmodium and 6 Babesia blood smears. The detection rate of parasitemia by the CellaVision was 63% at parasitemia levels <0.1%; 87% at parasitemia levels between 0.1% and <1.0%; 94% at parasitemia levels between 1.0% and 2.49%; and 100% at parasitemia levels of 2.5% or higher. This indicates that the detection rate by the CellaVision increases with higher parasitemia. The authors noted that the CellaVision is not approved by the US FDA for the detection of intracellular parasites.
2.1.3 Platelet Clumps
Gulati et al37 collected 102 blood samples from 79 patients, whose ages ranged from 0 days (newborn) to 82 years. They reported that detection of platelet clumps by the CellaVision was unreliable: the sensitivity varied between 40.4% and 82.8%, depending on whether the user looked for clumps or fibrin strands in the WBC display only or searched all of the WBC and PLT screens.
Similar findings were reported by Billard et al in 521 newborn and infant samples.16 This is chiefly due to the occurrence of platelet clumps outside the area screened by the CellaVision, for example, in the feather and lateral edges of the smear.
Gao et al38 compared 127 peripheral blood smears with manual microscopy, an automated hematology analyzer, and with the CellaVision DM96. The overall performance of the DM96 system for platelet counts was similar to both automated hematology analyzers and to manual platelet estimates. Bland-Altman plots showed no systematic bias.
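For readers unfamiliar with the Bland-Altman approach used in such comparisons, a minimal sketch follows; the paired platelet estimates are fabricated placeholders, not data from Gao et al.

```python
# Minimal Bland-Altman sketch: bias and 95% limits of agreement between two
# methods, using made-up paired platelet estimates (digital vs analyzer).
import numpy as np

digital = np.array([210, 180, 95, 350, 420, 60, 155, 240], dtype=float)   # x10^9/L
counter = np.array([220, 175, 100, 340, 410, 65, 150, 250], dtype=float)  # x10^9/L

diff = digital - counter
bias = diff.mean()                                  # systematic difference
sd = diff.std(ddof=1)                               # sample SD of the differences
loa_low, loa_high = bias - 1.96 * sd, bias + 1.96 * sd
print(f"bias={bias:.1f}, 95% limits of agreement=({loa_low:.1f}, {loa_high:.1f})")
```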
2.1.4 Body Fluids
Riedl et al39 analyzed 177 body fluids (cerebrospinal fluid [CSF], abdominal fluid, and continuous ambulatory peritoneal dialysis fluid) with the CellaVision and compared the results to manual microscopy. Preclassification accuracy was 90% for CSF and 83% for other fluids. Correlation coefficients for postclassification ranged from 0.92 to 0.99 for CSF and from 0.83 to 0.98 for other fluids, indicating that the CellaVision could be used in clinical practice for body fluid analysis. This method is approved by the US Food and Drug Administration (FDA).
2.1.5 Pediatric Samples
Billard et al16 reported that the detection of lymphoblasts showed low sensitivity and specificity in pediatric samples and that newborn blood films showed the lowest rates of overall accuracy. The authors suggest that the high misclassification rate may be the result of an increased cell fragility in young children. Indeed, the spreading of blood with a high hematocrit generates an increased number of damaged cells.16 Therefore, samples from newborns and from patients with leukemia may require manual review. An advantage of the CellaVision was time saved when scanning WBCs for the diagnosis of storage diseases. In contrast, Rollins-Raval et al17 found that blast identification in the pediatric population had excellent positive predictive value and sensitivity (94.7% and 90.5%).
2.1.6 Speed of review
The data in the literature on time savings achieved with the CellaVision are variable. The speed at which slides are reviewed by the CellaVision DM96 (approximately 30 slides per hour) has been estimated to be 10% to 25% faster than a manual differential, even after manual reclassification.40 The exact speed depends on the pathology of the patient, if any, and the skills of the user.4, 6 The review speed may be expected to be slower in busy laboratories with large numbers of samples that have low WBC counts as well as abnormal cells.
2.1.7 User survey
VanVranken et al41 used a questionnaire to determine the perceptions of 81 CellaVision users on a 5-point Likert scale. Of the 37 (46%) responders, 35.7% “strongly agreed” or “agreed” with the statement that “Digital imaging instruments are difficult to use with neonatal samples,” whereas 42.9% “strongly disagreed” or “disagreed.”
The benefits of using the CellaVision rated the strongest by 37 respondents (46%) included decreased eyestrain, consistency among patient results, and advantages in training personnel. Respondents reported notable limitations of the instrument, including difficulties with accurately estimating platelet counts and RBC morphology. A small separate survey of 8 nonusers of the CellaVision found that the factors impacting their laboratory's decision to continue manual microscopy included cost, availability of the CellaVision, difficulty of transition, low volume of patients, administrative problems, and concerns about turnaround times.
2.2 The DI-60 Integrated Slide Processing System (Sysmex, Kobe, Japan)
The DI-60 system was the first fully integrated cell image analyzer on the market. It consists of two Sysmex XN hematology analyzers, the Sysmex slide making/staining device SP-10, and a CellaVision instrument. Thus, a single sample placed on this automated device can provide a complete blood count (CBC) result, perform a blood smear preparation, and also determine cell location and identification.42
The DI-60 system utilizes identical software and modified hardware (camera lens system) as the CellaVision DM1200 analyzer. For quality control of the identification of cell location, cell location slides (recently drawn and freshly stained blood samples with normal WBC counts) are analyzed at regular intervals and after changes in staining procedures or staining solutions.42 The performance of the DI-60 system has been comprehensively evaluated in recent studies.42, 43 WBC classification by the DI-60 was acceptable both in normal and abnormal samples with improvements after user verification. However, RBC grading, on the basis of ICSH criteria, showed notable differences between the DI-60 and manual counting, suggesting a necessity for new RBC grading criteria for digital cell morphology analyzers.42, 44 Regarding RBC inclusions, a recent study by Park et al44 showed that the DI-60 may cause a false-negative result due to misclassified malarial parasites not only with low parasitemias (<1%) but even with parasitemia levels as high as 6.3%. This study underscored the fact that manual slide review may still be mandatory for malaria screening and that the DI-60 can only be used as a supplementary tool for the diagnosis of malaria.
Although platelet count estimation by the DI-60 showed very high correlation with the results obtained with the Sysmex XN, the DI-60 underestimated platelet counts in samples with marked thrombocytosis.42
The performance of automated digital cell imaging analyzers depends highly on the quality of the blood smear and the stain used for accurate identification of cells. A recent study compared the applicability of different staining methods to the DI-60: RAL Wright-Giemsa stain (RAL, RAL Diagnostics, Bordeaux, France), YD Wright-Giemsa stain (YD, YD Diagnostics, Yongin, Korea), and modified Leishman Giemsa stain (MLG, Merck).44 Overall, the results obtained with the DI-60 using these staining methods showed acceptable agreement and correlation with the manual counts; however, the staining methods differed in details of analytical performance and in their possible impact on laboratory efficiency. CellaVision also provides a SmearChecker,45 which assesses monolayer quality, stain intensity, and smear color. In the era of digital image analysis, conventional staining methods will require further adjustment and optimization for these new platforms, in accordance with each laboratory's situation.
2.2.1 Performance Summary of the CellaVision System
Many studies have compared the WBC differential counts on the CellaVision to manual differentials by the standard reference method.46 However, the imprecision of the manual differential count increases with cells of low incidence, for example, basophils.
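To illustrate the magnitude of this sampling imprecision (a worked example, not data from the cited studies): if the true proportion of a cell class is $p$ and $n$ cells are counted, the standard error of the manual differential is $\sqrt{p(1-p)/n}$. For basophils at $p = 1\%$ and a 100-cell count, the standard error is approximately 1.0%, so the 95% confidence interval spans roughly 0% to 3%; counting 200 cells only narrows it to about $\pm 1.4\%$.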
Preclassification by the CellaVision of RBC morphological abnormalities was compared to the gold standard of manual microscopy, but variation in manual microscopy skills between scientists, together with a lack of precise definitions of morphological abnormalities, affected reclassification outcomes.
Advantages of the use of the CellaVision compared to use of the microscope include the ability of remote reviews and reduced eyestrain and fatigue. The simultaneous display of all cells of a class is valuable in assisting the operator in the task of correctly identifying a specific cell type.15, 16 Another advantage of digital morphology analyzers is that cells are easily traceable and can be readily retrieved for review or if a second opinion is needed.4, 6 In addition, education and assessment of technical staff and residents can be standardized, and quality assurance (QA) and proficiency testing of WBC differentials and RBC reviews can be performed easily.
Overall, the CellaVision platform has been found to be at least satisfactory or acceptable for normal WBC differential counts and for the preclassification of RBC morphology. The literature is somewhat inconclusive on the ability of the CellaVision to correctly identify abnormal WBC, especially blasts, as well as plasma cells and immature granulocytes. Reduced immature granulocyte classification accuracy may be due in part to the subjectivity in classification of these cell types, but differences identified by the technologist were generally within one stage of maturation.6, 17 Most data in the literature indicate that the use of the CellaVision can lead to faster reviews, especially in settings with a low incidence of abnormal cells. However, it is not clear if the CellaVision can significantly decrease turnaround times in settings with patient populations with a high incidence of low WBC counts and abnormal cells.
Due to these limitations, samples from neonates and patients suspected to have leukemia or other diseases that can cause the presence of pathological WBC and RBC may require review by manual microscopy. The same applies for suspected dysplastic cells, schistocytes, intracellular infections, RBC agglutinates, and platelet clumps. Therefore, depending on the patient population, 10 to 20% of all slides may still require manual microscopic review. It should, however, be noted that the morphology of these pathological cells (eg, dysplastic cells) can still be assessed using the images generated by the digital microscope system, thereby reducing the rate of additional unnecessary manual assessment. Future improvement of digital imaging modules will most certainly improve detection and classification of specific pathological cell types (eg, dysplastic cells and intracellular parasites).
2.3 Vision Hema (West Medica, Perchtoldsdorf, Austria)
According to promotional material distributed by West Medica, the Vision Hema Automated Hematology Imaging Analyzer offers automatic analysis and preclassification of blood cells. The system provides detailed analysis of leukocytes, RBC, platelets, and reticulocytes. The Vision System Ultimate analyzer can load up to 200 slides at once; other systems are available that can load one, four, or eight slides at once. A separate module allows automation of both thick and thin smears for malaria diagnosis. The company also offers modules for identification and preclassification of cells in bone marrow samples, body fluids, and cervical cytology. Unfortunately, no peer-reviewed literature is available on this instrument.
2.4 EasyCell Assistant (Medica Corporation, Bedford, MA, USA)
The company manufactures hematology imaging systems called EasyCell Assistant, distributed by Medica Corporation, Sysmex, and Horiba Medical. The instruments locate 100 or 200 leukocytes on a blood smear and preclassify them, grouped by cell type. The analyzer also provides images of red cells and performs a platelet estimate. Unfortunately, no peer-reviewed literature is available on this instrument.
2.5 Nextslide (Nextslide Imaging, LLC, Cleveland, OH, USA)
Yu and colleagues evaluated the Nextslide Digital Review Network, an automated digital imaging system.47 The system includes applications for image processing, management, and online review; it scans images of red cells, white cells, platelets, and immature cells at ×100 magnification. The instrument has a throughput of 26 slides per hour. Using a total of 479 blood samples, comparisons were performed with manual microscopy and with the CellaVision DM96. Samples included “normal” smears as well as a wide variety of clinical conditions. Overall, there was good agreement between manual microscopy and the Nextslide. Similar correlation was observed between the Nextslide and the CellaVision: the two instruments gave essentially the same results for neutrophils, lymphocytes, monocytes, eosinophils, blasts, and immature granulocytes, whereas correlation coefficients for basophils, atypical lymphocytes, and bands were lower. Average review times on the Nextslide were 2.03 minutes, which compared very favorably with the average of 10.9 minutes for manual counts. The system can handle manually prepared slides and does not require an automated slidemaker.
At the time of writing this paper, the vendor could not be reached for comments.
2.6 Cobas m511 (Roche Diagnostics, Indianapolis, IN, USA)
A new prototype instrument, the Roche Cobas m511, has recently been described by Winkelman and colleagues.48 The platform introduces a novel method of “painting” or “printing” a known volume of undiluted, anticoagulated blood cells as a monolayer on a slide. After staining, automated slide-based technology and multispectral imaging are used for visualization, counting, and characterization of blood cells, to obtain cell counts and indices, and to derive a 5-part differential on 600 nucleated cells on the slide. In the high-magnification module, a dry ×50 lens is used to classify WBC types, evaluate RBC and platelets, and review cellular morphology. The authors note that “the optical systems are designed to provide high-quality images without the use of oil immersion.” The instrument can simultaneously provide both quantitative and morphologic information. It also eliminates the need for separate slidemakers, stainers, and lasers, thereby reducing its footprint.
Images of preclassified cells are displayed on a computer screen for reclassification, similar to the CellaVision. Correlation studies using 1263 samples comparing a prototype m511 instrument with a Sysmex XE showed excellent correlations.
Bruegel and co-workers reported the findings of a four-site multicenter study of the m511. They found good repeatability and reproducibility.49 The agreement with manual microscopy was around 95%. Correlations with the Sysmex XN system were very good, with a Pearson's r >0.95 for most CBC parameters. Extensive studies on the sensitivity and specificity for various pathological cell types (eg, blast cells) have yet to be performed.
2.7 HemaCAM (Fraunhofer Institute for Integrated Circuits IIS)
The HemaCAM system was developed for use in hematology laboratories for differential blood cell counts. It consists of a high-power microscope with a motorized stage that can be vertically and laterally moved by a computer. Cells are presented for review; a drag-and-drop function allows easy reclassification of cells. Information on RBC, WBC, platelets, and comments for individual blood samples can be added manually. Future applications under development may include detection of malaria parasites and automated recognition of bone marrow cells for the workup of leukemia. HemaCAM is certified in Europe as an in vitro diagnostic device in accordance with the Medical Devices Act in the European Union.50
2.7.1 Education and research
As pointed out by many authors, including VanVranken et al,41 Hsu et al,7 and Horiuchi et al,51 important advantages of digital morphology include the ability to easily consult colleagues, reference abnormal cells, and utilize archived images for education, quality assurance, competency assessment, archival, retrieval, and expert consultation from remote sites.
Horiuchi et al51 reported that the CellaVision competency software (CCS) proficiency testing program was useful for the quality assessment of laboratory performance, for education, and for the storage and distribution of cell images to be utilized for further standardization and education.
As demonstrated by Marionneaux et al, digital imaging also makes it possible to perform scientific studies on archived digital images.52 The ease of saving digital images together with other patient information is another advantageous feature, according to Riedl et al.39
Marionneaux et al highlighted the potential use of the DM96 as a rapid tool for the screening of atypical chronic lymphocytic leukemia (CLL) variants.53 Images of CLL lymphocytes from a DM96 were reclassified into lymphocyte morphological subtypes to facilitate the detection of typical or atypical CLL variants.
Alférez et al have used the DM96 platform for the automatic classification of atypical lymphoid B cells.40, 54 They were able to separate five different kinds of lymphoid cells based on DM96 images,36 reinforcing the possibility that digital imaging alone may allow more precise subclassification of WBCs.
Similar findings were obtained by the group of Herishanu et al,55 who demonstrated that the morphological analysis provided by the DM1200 may be a rapid and reliable tool to augment the performance of clinical laboratories in the diagnosis of CLL. Herishanu et al also investigated absolute monocyte counts in CLL patients and found a good correlation between the CellaVision DM1200 system and manual counts.56
Lippi et al used the DM96 to investigate spurious hemolysis in whole blood samples.57 They confirmed that RBC ghosts may be unsuitable to screen for spurious hemolysis. They also concluded that the DM96 could perhaps be used to detect hemolysis.
Mahe et al found good correlation in the reticulocyte count between the CellaVision DM96, the Beckman Coulter LH750 automated analyzer, and the manual reticulocyte count.58
3 RECOMMENDATIONS
On the basis of the review of the studies published in the scientific literature, the Working Panel on Digital Imaging now recommends the following specifications for routine use and experimental evaluation of systems based on automated digital evaluation of stained peripheral blood smears.
3.1 Appropriate use of digital imaging in conjunction with automated cell counters
The latest generations of automated cell counters perform very well in the identification of normal cells and samples with quantitative abnormalities. Evaluation of imprecision and inaccuracy versus the manual reference method indicates good performance for neutrophils, lymphocytes, and eosinophils (r = 0.974-0.888), less so for monocytes (r = 0.757-0.490), and poor for basophils (r = 0.532-0.078).59 Many instruments also have dedicated channels that allow the flagging of qualitative cellular abnormalities, with a very low incidence of false-negative results. Morphological flags from these instruments show variable levels of sensitivity and specificity.60 This can necessitate further studies such as the evaluation of cell counts, distribution curves, cytograms and flags, integrated with the patient's age and gender, delta checks, and clinical data, when available. Many laboratories have developed their own computer algorithms and rules to determine which samples should be smeared and stained, or use modifications of published recommendations, for example, those of the International Consensus Group for Hematology Review.61 A morphological review step can then often be performed quickly with digital morphology. The operator can validate the differential on digital images, and, if needed, also perform a manual microscopic review as a final check.
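A minimal illustrative sketch of such a rule set appears below. The thresholds, flag names, and rules are hypothetical assumptions chosen for illustration; they are not the International Consensus Group criteria and would need to be adapted and validated locally.

```python
# Illustrative sketch only: a simplified rule set deciding whether a sample
# should be smeared and sent for digital imaging review. All thresholds and
# flag names are assumptions for illustration, not published criteria.
def needs_smear_review(cbc: dict) -> bool:
    rules = [
        cbc["wbc"] < 4.0 or cbc["wbc"] > 30.0,   # WBC, x10^9/L
        cbc["plt"] < 100 or cbc["plt"] > 1000,   # platelets, x10^9/L
        cbc["hgb"] < 70,                          # hemoglobin, g/L (age/sex rules omitted)
        "blast" in cbc.get("flags", []),          # analyzer suspect flag
        "immature_gran" in cbc.get("flags", []),
        cbc.get("age_days", 99999) < 90,          # neonates/young infants
    ]
    return any(rules)

sample = {"wbc": 2.8, "plt": 320, "hgb": 128, "flags": ["blast"], "age_days": 14000}
print(needs_smear_review(sample))   # True: low WBC and a blast flag trigger review
```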
- Selection of samples to be digitized should be based on rules that use the flagging systems of the institution's automated analyzer as well as automated rules based on the specific needs of the laboratory.
- It is possible that abnormal cells which trigger flags when thousands of signals are processed by an automated blood cell counter will not be represented within the limited number of cells displayed as digital images. For example, a distribution of pathological cells limited to the margins of a slide can lead to their absence in the digital images. We therefore recommend to always reconcile the flags of the automated analyzer with the report of the digital imaging. In the presence of any discrepancy, a manual differential with the microscope should be performed.
- There may be substantial differences between digital systems that use slides that were prepared and stained manually and systems that use automated slidemakers and stainers. Although in the latter case smears and stains are standardized, correlation studies on differences in morphological details and color matching between the two methods are often lacking. In automatically prepared films, cells may appear larger and thinner, and there may be chromatic differences compared to cells stained with panoptical stains.44, 62, 63 In the presence of abnormal cells, or of lymphocytes in pediatric samples, these differences in size, thickness, and color can lead to incorrect cell identification, often resulting in an overestimation of blast cells.18
We strongly recommend that operators should be trained on this “new morphology” just as they are trained in the interpretation of cytograms from automated cell counters. The available compendia of digitized cell images are not sufficiently comprehensive to eliminate mistakes in cell classification. It is therefore incumbent upon the instrument manufacturers to provide as many abnormal cell images as possible, stained and prepared with both manual and automated methods.
Morphological correlation studies should be performed to achieve a correct interpretation and utilization of the new automated digitized morphology.
The advantages of the CellaVision systems over manual microscopy are listed in Table 1.
- Enables remote review (but approximately 10%-20% of slides may still need manual review)
- Allows digital archiving and easy retrieval of blood films
- Reduces fatigue and eyestrain
- Can reduce labor costs (but skilled labor is still essential)
- Facilitates morphology education, QA, training, and peer review in digital format
Situations in which the CellaVision has limitations and a manual microscopic review may be required are listed in Table 2. The recommendations are primarily based on the scientific experience published for the CellaVision.
- All samples from newborns and from patients with leukemia
- Suspicion of the presence of pathological cell types, including blasts, plasma cells, and immature granulocytes
- Suspected dysplastic cells
- Suspected schistocytes
- Screening for intracellular parasites (eg, malaria, Babesia, other parasites)
- RBC agglutinates
- Platelet clumps (due to their location in the feather and lateral edges of the smear)
3.2 Standardization
As much of the literature on standardization in digital imaging refers to whole slide imaging, these publications are applicable to digital morphology analyzers only in general terms and not as specific recommendations.
The need for standardization in digital imaging was highlighted by Yagi and Gilbertson as early as 2005.64 They noted multiple obstacles to standardization, including significant differences between laboratories in tissue processing and staining as well as in the hardware and software used to identify and classify different cell types.64, 65 Since the publication of their paper, digital imaging has developed and expanded tremendously; however, standardization of the field remains a distant goal.
Several professional organizations have published documents dealing with various aspects of standardization for digital imaging.
- The College of American Pathologists has published a guideline on validating whole slide imaging for pathology.65
- The FDA and the ICC (International Color Consortium) co-sponsored a summit on color in medical imaging.66
- DICOM (Digital Imaging and Communications in Medicine) has published a supplement to its standard introducing Information Object Definition (IOD) and Service-Object Pair (SOP) classes for whole slide imaging (WSI).67 Although this document is mainly focused on WSI in anatomical pathology, it is presently probably the most important document on standardization in digital hematology.
3.3 Practical recommendations
- Digital morphology analyzers should use high optical magnification, if possible ×100 (oil).
- In their paper calling for standardization of digital image analysis, Yagi and Gilbertson outlined six recommendations that remain valid and should be implemented as soon as possible by all manufacturers of digital imaging instrumentation64:
- Systems from different manufacturers should be able to share image files.
- Standards should allow the transmission of information on baseline colors and of recommended display parameters.
- The images should be useful to the pathologist, not necessarily better or worse than direct examination of a slide under the microscope.
- A mechanism to evaluate image quality objectively should be present.
- A mechanism to adjust and correct minor errors of tissue processing should be developed; in the context of digital analyzers, “tissue processing” applies to the quality of the smear and stain, which has been discussed in an earlier section.
- A public organization should support pathologists in the development of standards.
- Manufacturers should concentrate their efforts on the following:
- Improve accuracy of WBC and RBC preclassification.
- Improve automated recognition and classification of pathological cell types.
- Standardize RBC grading according to consensus guidelines. Palmer et al have adjusted default cutoffs for grading of morphological abnormalities depending on clinical significance.68
- Integrate cell counter results with digital imaging.
3.4 Recommendations for validation and verification studies for hematology laboratories
- Hematology laboratories that intend to use digital imaging should perform comparison studies between their predicate method (in most cases, this will be manual microscopy) and the digital method; a minimal sketch of such a comparison is shown after this list. At least 50 slides covering all cell types, including abnormal cells, should be evaluated. Such studies are especially important if the laboratory is switching from manual slide preparation and staining to automated methods.
- Laboratories which have switched from manual to automated methods should notify users of the change of method. If possible, high-quality pictures of cells prepared and stained with manual methods and with automated methods should be shared by manufacturers with users, with explanations of the changes users should expect. This is especially important in settings where external quality assessment (EQA) samples sent by local regulatory agencies are prepared with different methods than used in the laboratory.
- Laboratories should focus their blood film reviews on the clinically most important data, thereby reducing the workload of the morphologists; this can, for example, be achieved by concentrating on “critical” abnormalities. In 2016, Criel et al32 proposed “critical” categories for RBC, including sickle cells, schistocytes, teardrop cells, spherocytes, and parasites. A similar approach was taken by Nakashima and colleagues.69
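Referring to the comparison studies recommended in the first point above, the sketch below summarizes, per cell class, the correlation and mean bias between manual and digital differentials. The slide data are fabricated placeholders; a real verification study would use at least 50 slides covering normal and abnormal cell types and would typically add Bland-Altman or regression analysis.

```python
# Sketch of a simple method-comparison summary for a verification study:
# per cell class, Pearson correlation and mean bias of digital vs manual
# differentials (%). The data below are fabricated placeholders.
import numpy as np

manual  = {"neutrophils": np.array([55, 62, 70, 48, 80], float),
           "lymphocytes": np.array([30, 25, 18, 40, 10], float)}
digital = {"neutrophils": np.array([57, 60, 72, 47, 78], float),
           "lymphocytes": np.array([28, 27, 17, 42, 11], float)}

for cell_class in manual:
    m, d = manual[cell_class], digital[cell_class]
    r = np.corrcoef(m, d)[0, 1]     # Pearson correlation between methods
    bias = (d - m).mean()           # mean difference, digital minus manual
    print(f"{cell_class}: r={r:.3f}, mean bias={bias:+.1f} percentage points")
```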
4 SUMMARY
This monograph provides a broad synopsis of the peer-reviewed literature on digital image analysis in clinical hematology laboratories and a comprehensive summary of the strengths and weaknesses of the CellaVision platform and of other similar instruments on the market, based on information available at the time of writing. The increasing number of digital imaging instruments makes standardization and comparison of instruments a high priority.
As CellaVision was among the first companies to offer digital image analysis in laboratory hematology and is the market leader, most studies in the field evaluate the CellaVision; we have done our best to cite and discuss as many peer-reviewed papers as possible. Because the CellaVision technology has been available for at least fifteen years, a large number of papers in the literature describe specific features, abilities, and limitations of CellaVision instrumentation. We look forward to seeing more papers dedicated to specific features of instruments made by CellaVision or by other manufacturers.
Deficiencies of our analysis may include variability in the performance of Digital Morphology Hematology Analyzers due to differences in the patient populations from which samples were derived, in the instrument settings used, and in the skills of the morphologists in reclassifying cell images. Furthermore, detailed specifications of some of the newer instruments were not available from some vendors. Many published studies lacked disclosure statements; it was therefore uncertain to what extent they were sponsored by vendors.
In summary, our review documents the rapid development of digital morphology analyzers in laboratory hematology; at the same time, it draws attention to the many opportunities for improvement in this field.
CONFLICT OF INTEREST
The authors have no competing interests.
AUTHOR CONTRIBUTIONS
AK designed and supervised the study, reviewed the literature, and wrote the paper; S-HL reviewed the literature and provided advice; GZ, JR, and MH contributed significant content to specific parts of the manuscript; SM provided directions and advice; and all authors have critically read the manuscript.