Neural network in your pocket

One of the most interesting announcements at WWDC has mostly flown under the radar: an extension to the already impressive Accelerate framework called BNNS. Accelerate, if you don’t know, is a deeply (often hand) optimized low-level computation library that’s set up to leverage the specifics of Apple’s platforms. It’s the “go-to” library for numerical computation, linear algebra, matrix manipulation, etc., when speed and efficiency matter.

BNNS stands for “Basic Neural Network Subroutines” and includes optimized routines for the key pieces of setting up and using convolutional neural networks. The intent of the library is to provide efficient routines for putting together enough of a neural network to classify or recognize input. Notably, it doesn’t include the functions you would use for training a neural network, although I’m sure it’s possible to manually add code to do exactly that.

When I heard about BNNS and watched WWDC Session 715, I immediately wondered what it would take to start with something in Keras, TensorFlow, or Theano: construct and train the network with those libraries, and then port the network structure and trained weights into a neural network built with BNNS routines. I suspect there’s a reasonably straightforward way to do it, but I haven’t dug it out yet.
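One concrete way that port might work is to dump each trained layer’s weights and biases to flat binary files that an iOS app could bundle and feed to BNNS, which just wants raw buffers of a known data type. Here’s a minimal sketch, assuming the weight arrays are already in hand as NumPy arrays (in Keras you’d pull them out with `model.get_weights()`; the layer shape and file names here are invented for illustration):

```python
import numpy as np

def export_layer(weights, bias, name):
    """Write a layer's weights and bias as flat float32 binary files.

    Writing float32 buffers in a fixed order is enough for the iOS
    side to reconstruct the layer, since BNNS takes plain buffers
    plus a descriptor of their shape and data type.
    """
    weights.astype(np.float32).tofile(name + "_weights.bin")
    bias.astype(np.float32).tofile(name + "_bias.bin")

# Stand-in for trained values: a 784 -> 10 fully connected layer,
# the kind you'd get from a minimal MNIST classifier.
w = np.random.rand(784, 10)
b = np.zeros(10)
export_layer(w, b, "dense1")

# Round-trip check: the file holds exactly the flattened weights.
restored = np.fromfile("dense1_weights.bin", dtype=np.float32).reshape(784, 10)
assert np.allclose(restored, w.astype(np.float32))
```

The only real design decision here is agreeing on a byte order and data type between the exporter and the app; everything else is bookkeeping.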

Fortunately, recognition – or inference – is relatively “cheap” computationally, at least compared to training a network. So I expect there’s good potential to train externally, load the result as data into an iOS application, and then do the recognition on the device. If you find examples of anyone doing and demonstrating that, I would appreciate a comment about it here on the blog. Considering that training a large, diverse network can eat up the equivalent of a couple of racks of high-end-gaming-machine-quality GPU systems, training isn’t something I’d look at trying to do on an iOS device.
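To make the “inference is cheap” point concrete: once trained weights are loaded, classifying one input is just a few matrix multiplies and element-wise activations – no gradients, no backpropagation. A toy sketch in NumPy (the layer sizes are invented; on an iOS device this is exactly the work BNNS’s optimized routines would do for you):

```python
import numpy as np

def relu(x):
    return np.maximum(x, 0.0)

def softmax(x):
    e = np.exp(x - x.max())  # shift for numerical stability
    return e / e.sum()

# Pretend these came from the external training step (e.g. exported
# from Keras): a tiny 784 -> 128 -> 10 fully connected classifier.
rng = np.random.default_rng(0)
w1, b1 = rng.standard_normal((784, 128)), np.zeros(128)
w2, b2 = rng.standard_normal((128, 10)), np.zeros(10)

def classify(pixels):
    """One forward pass: two matmuls, one ReLU, one softmax."""
    hidden = relu(pixels @ w1 + b1)
    return softmax(hidden @ w2 + b2)

probs = classify(rng.random(784))
```

That entire pass is a few hundred thousand multiply-adds – trivial next to the billions of passes (forward and backward) that training runs through.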

While I think I understand most of the common pieces of building neural networks, I can’t really claim that yet, as I haven’t tried to teach it to someone else or used it “in anger” – when I needed to get something specific done NOW. Either of those is usually what has me hitting the subtle roadblocks and potholes that lead to “Ah-ha!” moments of deeper understanding. This may be a good side project: pick a couple of easier targets and see where I can get to. Maybe grab MNIST digit recognition or something similar, see what I can train with Keras or TensorFlow, and then figure out how to translate and port that into a classifier for iOS using BNNS.


Published by heckj

Developer, author, and life-long student. Writes online at https://rhonabwy.com/.
