# Extensor

Extensor provides [Tensorflow](https://tensorflow.org) bindings for inference
in [Elixir](https://elixir-lang.org/). The library can be used to execute
computation graphs created in Tensorflow on the CPU or GPU. Extensor provides
minimal abstractions over the Tensorflow C library and includes as little
custom native code as possible. Its NIFs have been extensively tested for
memory leaks and parallelism safety, so the library can be relied on in
production.

## Status
[![Hex](http://img.shields.io/hexpm/v/extensor.svg?style=flat)](https://hex.pm/packages/extensor)
[![Test](http://circleci-badges-max.herokuapp.com/img/pylon/extensor?token=:circle-ci-token)](https://circleci.com/gh/pylon/extensor)
[![Coverage](https://coveralls.io/repos/github/pylon/extensor/badge.svg)](https://coveralls.io/github/pylon/extensor)

The API reference is available [here](https://hexdocs.pm/extensor/).

## Installation

### Hex
```elixir
def deps do
  [
    {:extensor, "~> 0.1"}
  ]
end
```

### Dependencies
This project requires the Tensorflow C headers/libraries. For development,
these can be installed by following the steps [here](https://www.tensorflow.org/install/install_c).

For Docker deployment, see the sample dockerfiles in the `docker` directory.
The Ubuntu image can be built and tested with the following commands.

```bash
docker build -t extensor -f docker/ubuntu.dockerfile .
docker run --rm -it extensor mix test
```

If you have the NVIDIA Docker tooling installed, you can test on the GPU by
substituting `nvidia-docker` for `docker` above.

## Usage
As a simple example, Extensor can be used to evaluate the Pythagorean
theorem (c² = a² + b²). The following Python [script](
https://github.com/pylon/extensor/tree/master/test/pythagoras.py) can be used
to create and save a graph that calculates the length of the hypotenuse.

```python
import tensorflow as tf

# placeholders for the two legs of a right triangle
a = tf.placeholder(tf.float32, name='a')
b = tf.placeholder(tf.float32, name='b')

# hypotenuse: c = sqrt(a^2 + b^2)
c = tf.sqrt(tf.add(tf.square(a), tf.square(b)), name='c')

with tf.Session() as session:
    # serialize the graph definition as a binary protobuf
    tf.train.write_graph(session.graph_def,
                         'test/data',
                         'pythagoras.pb',
                         as_text=False)
```

The saved graph can then be loaded in Elixir to calculate a few hypotenuses.
The model file is also available in the repo under [test/data/pythagoras.pb](
https://github.com/pylon/extensor/tree/master/test/data/pythagoras.pb).

```elixir
# load the frozen graph_def protobuf into a new session
session = Extensor.Session.load_frozen_graph!("test/data/pythagoras.pb")

# input tensors, keyed by placeholder name
input = %{
  "a" => Extensor.Tensor.from_list([3, 5]),
  "b" => Extensor.Tensor.from_list([4, 12])
}

# run the graph, fetching the named output tensors
output = Extensor.Session.run!(session, input, ["c"])

# convert the result tensor back to an Elixir list
Extensor.Tensor.to_list(output["c"])
```

This block should output the list `[5.0, 13.0]`, which corresponds to the
lengths of the hypotenuses of the first two Pythagorean triples.
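
The bang functions above raise on failure. As a minimal sketch, assuming
Extensor follows the common Elixir convention of also exposing non-raising
variants that return `{:ok, result}`/`{:error, reason}` tuples (check the API
reference to confirm), error handling could look like this:

```elixir
input = %{
  "a" => Extensor.Tensor.from_list([8]),
  "b" => Extensor.Tensor.from_list([15])
}

# non-raising variants are assumed here; substitute the bang functions and
# try/rescue if the library only exposes the raising forms
with {:ok, session} <- Extensor.Session.load_frozen_graph("test/data/pythagoras.pb"),
     {:ok, output} <- Extensor.Session.run(session, input, ["c"]) do
  Extensor.Tensor.to_list(output["c"])
end
```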

### Model Formats
Extensor supports the frozen [graph_def](
https://www.tensorflow.org/extend/tool_developers/#graphdef) and [saved_model](
https://www.tensorflow.org/programmers_guide/saved_model) serialization
formats, via the `load_frozen_graph` and `load_saved_model` functions,
respectively.

For example, the Google [Inception](https://github.com/google/inception)
[model](http://download.tensorflow.org/models/inception_v3_2016_08_28.tar.gz)
has its weights frozen to const tensors, so that it can be loaded directly
from a protobuf.

However, the frozen graph approach doesn't work for models that contain
non-freezable variables (like RNNs). For these models, Extensor supports the
Tensorflow saved_model format, which is also the format used by Tensorflow
Serving (TFS). A saved_model is loaded from a directory path, which contains
the model metadata and initial variable weights.
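
As a rough sketch, loading a saved_model export looks much like the
frozen-graph example above. The export path, input name, and output name
below are hypothetical, and the optional config argument shown in the next
section is assumed to be omittable:

```elixir
# hypothetical export directory created by tf.saved_model; it contains a
# saved_model.pb file plus a variables/ subdirectory with the initial weights
session = Extensor.Session.load_saved_model!("priv/models/my_rnn_model")

# input/output tensor names depend on how the graph was exported
input = %{"tokens" => Extensor.Tensor.from_list([1, 2, 3])}
output = Extensor.Session.run!(session, input, ["logits"])
```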

### Configuration
Extensor supports passing a [ConfigProto](
https://www.tensorflow.org/versions/r1.0/api_docs/python/tf/ConfigProto)
object when creating a session in order to configure inference. See the
`Tensorflow.ConfigProto` module for more information on the configuration
data structures.

```elixir
config = %{
  Tensorflow.ConfigProto.new()
  | gpu_options: %{Tensorflow.GPUOptions.new() | allow_growth: true}
}

session = Extensor.Session.load_saved_model!("test/data/pythagoras", config)
```
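
Other fields of `Tensorflow.ConfigProto` generated from the protobuf
definitions can be set the same way. As a sketch, the configuration below
limits CPU thread parallelism and enables device placement logging; the field
names come from the Tensorflow ConfigProto definition, assuming the generated
Elixir structs expose them unchanged:

```elixir
# limit Tensorflow to a single thread per op and across ops, and log where
# each op is placed (field names assumed from the ConfigProto definition)
config = %{
  Tensorflow.ConfigProto.new()
  | intra_op_parallelism_threads: 1,
    inter_op_parallelism_threads: 1,
    log_device_placement: true
}

session = Extensor.Session.load_saved_model!("test/data/pythagoras", config)
```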

## Development
The Tensorflow protocol buffer wrappers were generated with the
[protobuf-elixir](https://github.com/tony612/protobuf-elixir) library via the
following command, assuming Tensorflow is cloned in the `../tensorflow`
directory:

```bash
protoc --elixir_out=lib --proto_path=../tensorflow $(ls -1 ../tensorflow/tensorflow/core/framework/*.proto ../tensorflow/tensorflow/core/protobuf/*.proto)
```

## License

Copyright 2018 Pylon, Inc.

  Licensed under the Apache License, Version 2.0 (the "License");
  you may not use this file except in compliance with the License.
  You may obtain a copy of the License at

      http://www.apache.org/licenses/LICENSE-2.0

  Unless required by applicable law or agreed to in writing, software
  distributed under the License is distributed on an "AS IS" BASIS,
  WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
  See the License for the specific language governing permissions and
  limitations under the License.