Computing Reviews

Convolutional neural networks in APL
Šinkarovs A., Bernecky R., Scholz S. ARRAY 2019 (Proceedings of the 6th ACM SIGPLAN International Workshop on Libraries, Languages and Compilers for Array Programming, Phoenix, AZ, Jun 22, 2019), 69-79, 2019. Type: Proceedings
Date Reviewed: 10/31/19

After a short introduction, the authors show how to implement a convolutional neural network (CNN) in APL, a programming language based on multidimensional arrays. Their approach is illustrated by an example CNN for handwriting recognition on the Modified National Institute of Standards and Technology (MNIST) dataset.

The authors define the CNN building blocks (the different CNN layers and the forward and backward passes) from which the whole network is composed. They then provide an experimental evaluation comparing their APL programs to the popular TensorFlow framework and to an implementation in Single Assignment C (SAC).
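To give a concrete flavor of such a building block, here is a minimal NumPy sketch of the forward pass of a single convolution layer. This is not the authors' APL code; the function name, shapes, and the tiny example are illustrative only.

```python
import numpy as np

def conv2d_forward(img, w):
    """Valid 2-D convolution (really cross-correlation, as in most
    CNN frameworks): slide the weight window over the image and take
    elementwise products summed over the window.
    Shapes: img (H, W), w (kh, kw) -> output (H-kh+1, W-kw+1)."""
    H, W = img.shape
    kh, kw = w.shape
    out = np.empty((H - kh + 1, W - kw + 1))
    for i in range(out.shape[0]):
        for j in range(out.shape[1]):
            out[i, j] = np.sum(img[i:i + kh, j:j + kw] * w)
    return out

# Tiny example: 4x4 input, 2x2 averaging filter -> 3x3 output
img = np.arange(16, dtype=float).reshape(4, 4)
w = np.full((2, 2), 0.25)
print(conv2d_forward(img, w))
```

In APL the same sliding-window computation can be expressed far more tersely (for example, with Dyalog's stencil operator), which is part of the paper's appeal.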

The paper is well structured and easy to follow. The authors give a thorough background on CNNs and APL, which facilitates reader understanding of the presented implementations. APL appears to be a well-suited, handy tool for experimenting with CNNs. Therefore, the paper will be especially interesting to researchers and practitioners who require flexible and easily adaptable implementations.

It would be interesting to see how other CNNs can be implemented using the defined CNN building blocks, to further support the claim that they are reusable. Also, a more detailed comparison with SAC would be helpful.

Some questions remain. Why is SAC not used directly to implement the CNNs? Which back end was used for TensorFlow, and why can implementations in SAC outperform TensorFlow, a highly popular framework developed by very experienced engineers?

Reviewer: Sergei Gorlatch | Review #: CR146755 (2002-0033)
