Hung, Jane; Goodman, Allen; Ravel, Deepali; Lopes, Stefanie C. P.; Rangel, Gabriel W.; Nery, Odailton A.; Malleret, Benoit; Nosten, Francois; Lacerda, Marcus V. G.; Ferreira, Marcelo U.; Rénia, Laurent; Duraisingh, Manoj T.; Costa, Fabio T. M.; Marti, Matthias; Carpenter, Anne E.
Additional file 1 of Keras R-CNN: library for cell detection in biological images using deep neural networks
Additional file 1: Figure S1. Comparison of mean average precision curves at different IoU thresholds for Keras R-CNN versus CellProfiler on the nuclei and malaria datasets. At an IoU threshold of 0.5, Keras R-CNN reaches a mean average precision of 0.99 on the nuclei dataset and 0.78 on the malaria dataset.
Figure S2. Overview of P. vivax data and results. The samples contain two classes of uninfected cells (red blood cells and leukocytes) and four classes of infected cells (gametocytes, rings, trophozoites, and schizonts) and are heavily imbalanced: more than 95% of all cells are uninfected, roughly matching the distribution in patient blood. A. Depiction of all relevant cell types found in human blood, including the two types of uninfected cells and the four types of infected cells in the P. vivax life cycle. The cycle on the left shows asexual development; gametocytes arise from sexual development and lead to transmission. B. Confusion matrix comparing annotations of two experts (colors normalized so that rows sum to 1); the substantial off-diagonal signal illustrates how difficult it is for experts to agree on the proper stage label for each cell. Experts were asked to identify relevant cells and label them as one of the cell types or as difficult. C. Example of malaria-infected blood smear results. Red boxes are ground truth; blue boxes are predictions produced by Keras R-CNN.
Table S1. Malaria datasets.
Figure S3. Results for P. falciparum.
Figure S4. Visualization of learned features and single-cell data. Diffusion pseudotime plots made from deep learning features with accompanying ground-truth class information. The first row plots the first two diffusion coordinates; the second row plots the second and third diffusion coordinates. Note: the model used to generate these plots differs slightly from the final model used in the paper.
Figure S5. Visualization of learned features and single-cell data. t-SNE plot made from deep learning features, colored by ground-truth class information. Note: the model used to generate these plots differs slightly from the final model used in the paper.
Figure S6. Inference time comparison across common object detection methods.
Keywords: Deep learning; Keras; Convolutional networks; Malaria; Object detection
2020-07-12
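Figure S1 reports mean average precision at fixed IoU thresholds. The sketch below (not part of the deposited file; the box format (x1, y1, x2, y2) and the 0.5 threshold are illustrative assumptions) shows how a single predicted box can be matched to a ground-truth box at such a threshold.

```python
# Minimal sketch: scoring one predicted box against one ground-truth box at an
# IoU threshold, as in the evaluation summarized in Figure S1. Illustrative only.

def iou(box_a, box_b):
    """Intersection over union of two axis-aligned boxes given as (x1, y1, x2, y2)."""
    x1 = max(box_a[0], box_b[0])
    y1 = max(box_a[1], box_b[1])
    x2 = min(box_a[2], box_b[2])
    y2 = min(box_a[3], box_b[3])
    inter = max(0, x2 - x1) * max(0, y2 - y1)
    area_a = (box_a[2] - box_a[0]) * (box_a[3] - box_a[1])
    area_b = (box_b[2] - box_b[0]) * (box_b[3] - box_b[1])
    return inter / float(area_a + area_b - inter)

# A prediction counts as a true positive when its IoU with an unmatched
# ground-truth box meets the threshold; averaging precision over recall levels
# and classes gives the mean average precision reported in the figure.
prediction = (12, 15, 48, 52)      # hypothetical predicted box
ground_truth = (10, 14, 50, 50)    # hypothetical ground-truth box
print(iou(prediction, ground_truth) >= 0.5)  # True -> counted as a match at IoU 0.5
```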
    https://springernature.figshare.com/articles/journal_contribution/Additional_file_1_of_Keras_R-CNN_library_for_cell_detection_in_biological_images_using_deep_neural_networks/12644686
10.6084/m9.figshare.12644686.v1
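Figures S4 and S5 visualize learned deep-learning features with diffusion maps and t-SNE. A minimal sketch of the t-SNE step is given below; it is not the authors' exact pipeline, and the feature array, label array, and t-SNE parameters are placeholder assumptions.

```python
# Minimal sketch of the kind of embedding shown in Figure S5 (illustrative only):
# project per-cell deep-learning features to 2-D with t-SNE and color by class.
import numpy as np
from sklearn.manifold import TSNE
import matplotlib.pyplot as plt

rng = np.random.default_rng(0)
features = rng.normal(size=(500, 128))   # placeholder for extracted per-cell features
labels = rng.integers(0, 6, size=500)    # placeholder ground-truth class labels

embedding = TSNE(n_components=2, perplexity=30, random_state=0).fit_transform(features)
plt.scatter(embedding[:, 0], embedding[:, 1], c=labels, s=5, cmap="tab10")
plt.xlabel("t-SNE 1")
plt.ylabel("t-SNE 2")
plt.show()
```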