BinClass

An easy-to-use Elixir library for building, training, and deploying binary text classifiers with Axon.

This library provides a simplified interface for training a neural network on text data and using it for predictions, handling tokenization, vectorization, and model training out of the box.

Installation

The package can be installed by adding bin_class to your list of dependencies in mix.exs:

def deps do
  [
    {:bin_class, "~> 0.1.0"}
  ]
end

Quick Start

1. Prepare your data

Data should be an enumerable of maps with a :text string and a :label of 0 or 1.

data = [
  %{text: "This is a great product!", label: 1},
  %{text: "I really hated this experience.", label: 0},
  # ... more samples
]
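If your raw data arrives in another shape (for example, tuples parsed from a CSV export), it can be reshaped into this structure with plain Enum calls. A minimal sketch, assuming hypothetical {text, label} tuples as input:

```elixir
# Hypothetical raw rows, e.g. parsed from a CSV export.
raw_rows = [
  {"This is a great product!", 1},
  {"I really hated this experience.", 0}
]

# Reshape into the %{text: ..., label: ...} maps the trainer expects.
data = Enum.map(raw_rows, fn {text, label} -> %{text: text, label: label} end)
```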

2. Train the model

# Labels can be given as a list (indices 0 and 1) or as an explicit map
classifier = BinClass.Trainer.train(data,
  epochs: 10,
  labels: %{0 => :negative, 1 => :positive}
)

# Optional: enable auto-tuning to search for the best learning rate and dropout rate
classifier = BinClass.Trainer.train(data, tune: true)
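Before training, you may want to hold out part of the data for evaluation. A simple shuffled split using only the standard library (the 80/20 ratio here is an arbitrary choice, not a library default):

```elixir
# Toy dataset of ten labeled samples.
data =
  for i <- 1..10 do
    %{text: "sample #{i}", label: rem(i, 2)}
  end

# Shuffle, then hold out 20% for evaluation.
{train_data, test_data} =
  data
  |> Enum.shuffle()
  |> Enum.split(round(length(data) * 0.8))
```

train_data would then be passed to BinClass.Trainer.train/2 as shown above, with test_data reserved for checking the trained model.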

3. Save and Load

You can save the entire model (including tokenizer, parameters, and metadata) to a single file.

BinClass.save(classifier, "my_model.bin")

# Load the model as an Nx.Serving struct (recommended for most apps)
serving = BinClass.load("my_model.bin")

4. Optimized Inference

There are two ways to run predictions:

A. Using Nx.Serving (High Throughput)

Recommended for web servers and concurrent applications; it batches concurrent requests automatically.

prediction = Nx.Serving.run(serving, "I love this library!")
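In a long-running application, the serving can also be started under your supervision tree so that requests from many processes share one batched model instance. A sketch using the standard Nx.Serving child spec; the MyApp.Classifier name and the batch_timeout value are illustrative choices, not BinClass requirements:

```elixir
# In your application's supervision tree (e.g. MyApp.Application.start/2):
children = [
  {Nx.Serving,
   serving: BinClass.load("my_model.bin"),
   name: MyApp.Classifier,
   batch_timeout: 100}
]

# Anywhere in the app, concurrent calls are transparently batched:
prediction = Nx.Serving.batched_run(MyApp.Classifier, "I love this library!")
```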

B. Using a Compiled Predictor (Ultra-Low Latency)

Recommended for CLI tools and scripts that need the lowest possible latency for single items; it bypasses the serving overhead entirely.

classifier = BinClass.load_classifier("my_model.bin")
predict = BinClass.compile_predictor(classifier)

result = predict.("This is ultra fast.")
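Because the compiled predictor is a plain anonymous function, scoring a list of texts is just a map over it. A sketch building on the calls above:

```elixir
classifier = BinClass.load_classifier("my_model.bin")
predict = BinClass.compile_predictor(classifier)

# predict is an ordinary function, so Enum.map/2 gives simple batch scoring.
texts = ["This is ultra fast.", "Not my favorite."]
results = Enum.map(texts, predict)
```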

Examples

Check out the examples/ directory for scripts demonstrating various use cases.

Features