Clojure Package for MXNet

One of the strengths of MXNet is its multi-language support. With the shared backend written in C++, you can train, use, and scale your deep learning models in the language you are most comfortable with. As of the v1.3 release, the family of language bindings for MXNet has grown beyond Python, Scala, C++, Julia, Perl, and R to include Clojure.

The Clojure MXNet package opens up modern deep learning and flexible, efficient GPU computing to Clojure users everywhere. To give you a taste of what is available, we’ll take a quick tour and highlight a few things you can do.

Image Recognition

You can load state-of-the-art pre-trained models in MXNet and quickly run predictions on images.

The function below fetches the ResNet-152 pre-trained model and loads it into a module. It also loads the synset.txt labels for the classification categories. The function accepts an image URL, runs the image through the model, and returns the top 5 probabilities with their labels.
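A sketch of such a function might look like the following, assuming the ResNet-152 checkpoint and synset.txt have already been downloaded into a local model directory, and using a hypothetical get-image helper that turns the URL into an NDArray of the expected shape:

```clojure
(require '[clojure.string :as string]
         '[org.apache.clojure-mxnet.module :as m]
         '[org.apache.clojure-mxnet.ndarray :as ndarray])

(def model-dir "model")  ;; assumed location of the downloaded checkpoint and synset.txt
(def h 224)              ;; ResNet-152 expects 224x224 RGB input
(def w 224)
(def num-channels 3)

;; Hypothetical helper: fetch the image at `url`, resize it to 224x224, and
;; return it as an NDArray of shape [1 3 224 224]. Implementation omitted.
(declare get-image)

(defn predict [img-url]
  (let [mod    (m/load-checkpoint {:prefix (str model-dir "/resnet-152") :epoch 0})
        labels (string/split-lines (slurp (str model-dir "/synset.txt")))
        image  (get-image img-url)
        probs  (-> mod
                   (m/bind {:for-training false
                            :data-shapes [{:name "data" :shape [1 num-channels h w]}]})
                   (m/forward {:data [image]})
                   (m/outputs)
                   (ffirst)
                   (ndarray/->vec))]
    ;; pair every probability with its synset label and keep the 5 most likely
    (->> (map vector labels probs)
         (sort-by second >)
         (take 5))))
```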

Let’s see what happens when we give it an image of a tabby cat.
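With the helper sketched above in place, the call would look something like this (the URL here is just a placeholder):

```clojure
(predict "https://example.com/images/tabby-cat.jpg")
;; => a seq of five [label probability] pairs, highest probability first
```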

We can see that the top prediction accurately chooses the tabby. For the full code, you can check out the pre-trained models example in the GitHub repo.

You can also create your own models and train them with the Module API. The Clojure package provides a clean way to compose your own layers: you can construct a network of fully connected and activation layers that can be used for training on the MNIST handwritten digits.
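A sketch of that kind of composition, along the lines of the package’s MNIST example, might look like this (the layer names and sizes are illustrative):

```clojure
(require '[org.apache.clojure-mxnet.symbol :as sym])

(defn get-symbol []
  (as-> (sym/variable "data") data
    (sym/fully-connected "fc1" {:data data :num-hidden 128})
    (sym/activation "relu1" {:data data :act-type "relu"})
    (sym/fully-connected "fc2" {:data data :num-hidden 64})
    (sym/activation "relu2" {:data data :act-type "relu"})
    (sym/fully-connected "fc3" {:data data :num-hidden 10}) ;; one output per digit class
    (sym/softmax-output "softmax" {:data data})))
```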

You can explore the Clojure Module documentation to learn more about training and predicting.
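For instance, once the network above is wrapped in a module, a few epochs of training could be kicked off like this; the MNIST data iterators are assumed to be supplied by the caller rather than constructed here:

```clojure
(require '[org.apache.clojure-mxnet.module :as m])

;; Train the network sketched above for a few epochs. The MNIST train/test
;; data iterators are assumed to be supplied by the caller (for example,
;; built with org.apache.clojure-mxnet.io/mnist-iter).
(defn train! [train-data test-data]
  (-> (m/module (get-symbol))          ;; wrap the symbol in a module (CPU context by default)
      (m/fit {:train-data train-data
              :eval-data  test-data
              :num-epoch  3})))
```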

Generative Models

Generative models are available for Clojure as well, with full support for GANs (Generative Adversarial Networks). A great illustration is the MNIST handwritten digits GAN example: starting from a training set of digits, you can watch the program generate more and more realistic images as training progresses.
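To give a flavor of how a generator might be composed with the symbol API, here is a rough sketch (not the example’s actual network; the layer names, kernel sizes, and filter counts are made up):

```clojure
(require '[org.apache.clojure-mxnet.symbol :as sym])

;; Upsample a random noise vector into a single-channel image with a stack of
;; deconvolution / batch-norm / activation layers, ending in a tanh.
(defn generator []
  (as-> (sym/variable "rand") data
    (sym/deconvolution "g1" {:data data :kernel [4 4] :stride [1 1] :pad [0 0]
                             :num-filter 256 :no-bias true})
    (sym/batch-norm "gbn1" {:data data :fix-gamma true})
    (sym/activation "gact1" {:data data :act-type "relu"})
    (sym/deconvolution "g2" {:data data :kernel [4 4] :stride [2 2] :pad [1 1]
                             :num-filter 128 :no-bias true})
    (sym/batch-norm "gbn2" {:data data :fix-gamma true})
    (sym/activation "gact2" {:data data :act-type "relu"})
    (sym/deconvolution "g3" {:data data :kernel [4 4] :stride [2 2] :pad [1 1]
                             :num-filter 1 :no-bias true})
    (sym/activation "gact3" {:data data :act-type "tanh"})))
```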

Natural Language Processing

You can also explore using RNNs (Recurrent Neural Networks) to generate text from a corpus with this example. The program starts with no knowledge of language or language rules; it simply trains on a corpus of Obama’s speeches, taking one character at a time and trying to predict the character that comes next. Gradually, over the course of many epochs, it learns how to generate sentences. Given the starter text “The joke”, it produces something that is surprisingly good.

The joke before them prepared for five years ago, we only hear a chance to lose our efforts and they made striggling procedural deficit at the city between a politics in the efforts on the Edmund Pett
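To make the character-level idea concrete, here is a minimal sketch of the kind of preprocessing such an example rests on, in plain Clojure (the corpus path and helper names are hypothetical):

```clojure
;; Build a character vocabulary from the corpus and turn text into the index
;; sequences the RNN trains on, one character predicting the next.
(defn build-vocab [text]
  (into {} (map-indexed (fn [i c] [c i]) (distinct text))))

(defn text->indices [vocab text]
  (mapv vocab text))

(comment
  (let [corpus (slurp "data/obama.txt")  ;; hypothetical corpus file
        vocab  (build-vocab corpus)]
    (text->indices vocab "The joke")))
```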

Wrap Up

The Clojure API for MXNet opens up exciting opportunities for the Clojure community to get involved with deep learning in the language you love. Dive in and get started today with the online project documentation and examples.