# Recognized-TBL-by-CNN

To ensure that other researchers without coding experience can implement our neural network, the analysis was built on the open-source deep learning integration of the Konstanz Information Miner (KNIME) Analytics Platform [1]. Image preparation consists of resizing each image to match the input layer of the corresponding pre-trained CNN. As mentioned earlier, a patch size of 30 was selected, and the patches were resized to 224 × 224 for VGG16, VGG19, and DenseNet121, and to 150 × 150 for InceptionV3 and Xception. After this preprocessing step, each cropped patch was fed into the convolutional neural network to extract image features for classification.

We uploaded the workflow file, which reproduces the image preprocessing and recognition programs on KNIME 4.0.0, to GitHub (VGG16 is taken as an example; the reader can easily extend it to the other networks). Running the workflow requires the following KNIME extensions:

1. KNIME Deep Learning - Keras Integration (Labs)
2. KNIME Image Processing (Community Contributions Trusted)
3. KNIME Image Processing - Deep Learning Extension (Community Contributions Trusted)
4. KNIME Image Processing - Python Extension (Community Contributions Trusted)
5. KNIME Streaming Execution (Labs)

A local Python installation that includes Keras is also required; see https://www.knime.com/deeplearning#keras for installation recommendations and further information.
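For readers who prefer plain Python over the KNIME workflow, the sketch below illustrates the same two steps — resizing a cropped patch to the network's input size and extracting features with a pre-trained CNN — using Keras and VGG16. It is an illustrative approximation, not the uploaded workflow; the file path and the choice of global average pooling are assumptions made for this example.

```python
# Minimal sketch of the preprocessing + feature-extraction steps described above,
# using VGG16 as in the uploaded workflow. This is NOT the KNIME workflow itself;
# the patch path below is a hypothetical placeholder.
import numpy as np
from tensorflow.keras.applications.vgg16 import VGG16, preprocess_input
from tensorflow.keras.preprocessing import image

# VGG16/VGG19/DenseNet121 patches are resized to 224 x 224;
# InceptionV3/Xception patches would instead use 150 x 150.
INPUT_SIZE = (224, 224)

# Pre-trained VGG16 without the classification head; global average pooling
# turns the last convolutional feature maps into a single feature vector.
model = VGG16(weights="imagenet", include_top=False, pooling="avg")

def extract_features(patch_path):
    """Resize one cropped patch to the network input size and extract its features."""
    img = image.load_img(patch_path, target_size=INPUT_SIZE)
    x = image.img_to_array(img)
    x = preprocess_input(np.expand_dims(x, axis=0))
    return model.predict(x)[0]  # 512-dimensional feature vector for VGG16

# Example usage with a hypothetical 30-pixel patch cropped in the preprocessing step
features = extract_features("patches/sample_patch_30px.png")
print(features.shape)
```

The extracted feature vectors can then be passed to any downstream classifier, which is the role the recognition part of the KNIME workflow plays.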

[1] M.R. Berthold, N. Cebron, F. Dill, T.R. Gabriel, T. Kötter, T. Meinl, P. Ohl, C. Sieb, K. Thiel, B. Wiswedel, KNIME: The Konstanz Information Miner, in: C. Preisach, H. Burkhardt, L. Schmidt-Thieme, R. Decker (Eds.), Data Analysis, Machine Learning and Applications, Springer, Berlin, Heidelberg, 2008: pp. 319–326. https://doi.org/10.1007/978-3-540-78246-9_38.