12859_2020_3635_MOESM1_ESM.docx (4.82 MB)

Additional file 1 of Keras R-CNN: library for cell detection in biological images using deep neural networks

Journal contribution posted on 2020-07-12, 04:17, authored by Jane Hung, Allen Goodman, Deepali Ravel, Stefanie C. P. Lopes, Gabriel W. Rangel, Odailton A. Nery, Benoit Malleret, Francois Nosten, Marcus V. G. Lacerda, Marcelo U. Ferreira, Laurent Rénia, Manoj T. Duraisingh, Fabio T. M. Costa, Matthias Marti, Anne E. Carpenter
Additional file 1:
Figure S1. Comparison of mean average precision curves at different IoU thresholds for Keras R-CNN versus CellProfiler on the nuclei and malaria datasets. For nuclei, Keras R-CNN reaches a mean average precision of 0.99 at an IoU threshold of 0.5; for malaria, it reaches 0.78 at the same threshold (an illustrative IoU sketch follows this list).
Figure S2. Overview of P. vivax data and results. The samples contain two classes of uninfected cells (red blood cells and leukocytes) and four classes of infected cells (gametocytes, rings, trophozoites, and schizonts) and are heavily imbalanced: more than 95% of all cells are uninfected, roughly the distribution in patient blood. A. Depiction of all relevant cell types found in human blood, including the two types of uninfected cells and the four types of infected cells in the P. vivax life cycle. The cycle on the left shows asexual development; gametocytes arise from sexual development and lead to transmission. B. Confusion matrix comparing the annotations of two experts (colors normalized so that rows sum to 1; see the normalization sketch after this list); the substantial off-diagonal signal reflects how difficult it is for experts to agree on the proper stage label for each cell. Experts were asked to identify relevant cells and label each as one of the cell types or as difficult. C. Example results on a malaria-infected blood smear. Red boxes are ground truth; blue boxes are predictions produced by Keras R-CNN.
Table S1. Malaria datasets.
Figure S3. Results for P. falciparum.
Figure S4. Visualization of learned features and single-cell data. Diffusion pseudotime plots made from deep learning features, with accompanying ground-truth class information. The first row plots the first two diffusion coordinates; the second row plots the second and third diffusion coordinates. Note: the model used to generate these plots is slightly different from the final one run in the paper.
Figure S5. Visualization of learned features and single-cell data. t-SNE plot made from deep learning features, colored by ground-truth class information. Note: the model used to generate these plots is slightly different from the final one run in the paper.
Figure S6. Inference time comparison across common object detection methods.
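Figure S1 reports mean average precision at IoU thresholds such as 0.5. As a rough illustration of how detections are scored against ground truth, here is a minimal Python sketch; it is not the authors' evaluation code, and the (x_min, y_min, x_max, y_max) box format and greedy matching rule are assumptions made for this example.

def iou(box_a, box_b):
    """Intersection over union of two (x_min, y_min, x_max, y_max) boxes."""
    x1, y1 = max(box_a[0], box_b[0]), max(box_a[1], box_b[1])
    x2, y2 = min(box_a[2], box_b[2]), min(box_a[3], box_b[3])
    inter = max(0.0, x2 - x1) * max(0.0, y2 - y1)
    area_a = (box_a[2] - box_a[0]) * (box_a[3] - box_a[1])
    area_b = (box_b[2] - box_b[0]) * (box_b[3] - box_b[1])
    union = area_a + area_b - inter
    return inter / union if union > 0 else 0.0

def precision_at_iou(predictions, ground_truth, threshold=0.5):
    """Fraction of predictions matching a distinct ground-truth box at >= threshold IoU."""
    matched = set()
    hits = 0
    for pred in predictions:
        best_j, best_score = None, 0.0
        for j, gt in enumerate(ground_truth):
            if j in matched:
                continue  # each ground-truth box can be matched only once
            score = iou(pred, gt)
            if score > best_score:
                best_j, best_score = j, score
        if best_j is not None and best_score >= threshold:
            matched.add(best_j)
            hits += 1
    return hits / len(predictions) if predictions else 0.0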

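Figure S2B describes a confusion matrix whose colors are normalized so that each row sums to 1. A minimal sketch of that row normalization, assuming raw agreement counts between the two experts; the label list and counts below are placeholders for illustration, not the study's data.

import numpy as np

labels = ["red blood cell", "leukocyte", "gametocyte", "ring",
          "trophozoite", "schizont", "difficult"]  # placeholder class list
counts = np.random.randint(0, 50, size=(len(labels), len(labels)))  # placeholder counts

row_sums = counts.sum(axis=1, keepdims=True)
normalized = counts / np.maximum(row_sums, 1)  # each nonempty row now sums to 1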
Funding

Foundation for the National Institutes of Health
Burroughs Wellcome Fund
Royal Society
National Science Foundation
Fundação de Amparo à Pesquisa do Estado de São Paulo
Conselho Nacional de Desenvolvimento Científico e Tecnológico
Howard Hughes Medical Institute
