# DSPy for Elixir via SnakeBridge

Declarative LLM programming with full access to Stanford NLP's DSPy framework.
## Overview
DSPex brings DSPy — Stanford NLP's framework for programming language models — to Elixir. It ships with SnakeBridge-generated `Dspy.*` bindings that mirror DSPy's package layout (great for HexDocs + IDE navigation), plus a minimal `DSPex` convenience layer over SnakeBridge's Universal FFI. Use the generated modules for the full API surface or the thin FFI wrapper for direct calls.
DSPex 0.12.0 pins and generates bindings for DSPy 3.2.0.
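The two layers are interchangeable sketch-wise. Below is a minimal illustration; the `Dspy.Predict.new/1` binding name and argument shape are assumptions based on the generated package layout (check the generated HexDocs for the exact signature), while the `DSPex` calls follow the Quick Start below:

```elixir
DSPex.run(fn ->
  # Convenience layer: DSPex wraps module creation and method calls
  predict = DSPex.predict!("question -> answer")

  # Generated layer (assumed binding name): mirrors dspy.Predict directly
  {:ok, predict2} = Dspy.Predict.new("question -> answer")

  # Both return Python object references usable with DSPex.method!/4
  result = DSPex.method!(predict, "forward", [], question: "Why is the sky blue?")
  IO.puts(DSPex.attr!(result, "answer"))
end)
```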
## Why DSPex?
- **Two access layers** — Generated `Dspy.*` modules plus a minimal `DSPex` FFI wrapper
- **Full DSPy access** — Signatures, Predict, ChainOfThought, optimizers, and more
- **Python-shaped docs** — Module tree mirrors DSPy packages for clean HexDocs
- **100+ LLM providers** — OpenAI, Anthropic, Google, Ollama, and anything LiteLLM supports
- **Production-ready timeouts** — Built-in profiles for ML inference workloads
- **Elixir-native error handling** — `{:ok, result}` / `{:error, reason}` everywhere
## Installation
Prerequisites (one-time):

- Python 3.9+
- uv for Python package setup:

  ```shell
  curl -LsSf https://astral.sh/uv/install.sh | sh -
  ```

Optional (RLM example only): the Deno runtime (an external binary), installed via asdf:

```shell
asdf plugin add deno https://github.com/asdf-community/asdf-deno.git
asdf install  # uses the pinned version in .tool-versions
```
Add DSPex to your `mix.exs`:

```elixir
def deps do
  [
    {:dspex, "~> 0.12.0"}
  ]
end
```
Create `config/runtime.exs` for Python bridge configuration:

```elixir
import Config

SnakeBridge.ConfigHelper.configure_snakepit!()
```

Then install dependencies and set up Python:

```shell
mix deps.get
mix snakebridge.setup  # Creates managed venv + installs dspy-ai automatically
```
SnakeBridge manages an isolated venv under `priv/snakepit/python/venv`; no manual venv creation or pip installs are needed. This release installs DSPy 3.2.0.
For local development against a checkout of SnakeBridge, this repo uses a path dependency (`../snakebridge`). To use the Hex release instead, update `mix.exs` to depend on a versioned `{:snakebridge, "~> ..."}`.
When SnakeBridge changes or you need to refresh the generated wrappers, run:

```shell
mix snakebridge.regen
```

Add `--clean` to remove generated artifacts and metadata before regenerating.
The RLM flagship example uses DSPy's default `PythonInterpreter` (Pyodide/WASM), which requires Deno on your `PATH`.
## Quick Start
```elixir
DSPex.run(fn ->
  # 1. Create and configure a language model
  lm = DSPex.lm!("gemini/gemini-flash-lite-latest")
  DSPex.configure!(lm: lm)

  # 2. Create a predictor with a signature
  predict = DSPex.predict!("question -> answer")

  # 3. Run inference
  result = DSPex.method!(predict, "forward", [], question: "What is the capital of France?")
  answer = DSPex.attr!(result, "answer")
  IO.puts(answer) # => "Paris"
end)
```

## Core Concepts
### Signatures
DSPy signatures define input/output contracts using a simple arrow syntax:
```elixir
# Single input/output
predict = DSPex.predict!("question -> answer")

# Multiple fields
predict = DSPex.predict!("context, question -> answer")

# Rich multi-field signatures
predict = DSPex.predict!("title, content -> category, keywords, sentiment")
```

### Modules
DSPex supports all DSPy modules:
```elixir
# Simple prediction
predict = DSPex.predict!("question -> answer")

# Chain-of-thought reasoning (includes intermediate steps)
cot = DSPex.chain_of_thought!("question -> answer")
result = DSPex.method!(cot, "forward", [], question: "What is 15% of 80?")
reasoning = DSPex.attr!(result, "reasoning") # Shows step-by-step thinking
answer = DSPex.attr!(result, "answer")
```

### Language Models
Any LiteLLM-compatible provider works out of the box:
```elixir
# Google Gemini (default)
lm = DSPex.lm!("gemini/gemini-flash-lite-latest", temperature: 0.7)

# OpenAI
lm = DSPex.lm!("openai/gpt-4o-mini")

# Anthropic
lm = DSPex.lm!("anthropic/claude-3-sonnet-20240229")

# Local Ollama
lm = DSPex.lm!("ollama/llama2")
```

### Direct LM Calls
Bypass modules and call the LM directly:
```elixir
{:ok, lm} = Dspy.LM.new("gemini/gemini-flash-lite-latest", [], temperature: 0.9)

# Direct call with messages
messages = [%{"role" => "user", "content" => "Say hello in French"}]
{:ok, response} = Dspy.LM.forward(lm, [], messages: messages)
```

## Examples
DSPex includes 20 comprehensive examples demonstrating various use cases:
Use `mix run --no-start` so DSPex owns the Snakepit lifecycle and closes the process registry DETS cleanly (this avoids repair warnings after unclean exits).
| Example | Description | Run Command |
|---|---|---|
| `basic.exs` | Simple Q&A prediction | `mix run --no-start examples/basic.exs` |
| `chain_of_thought.exs` | Reasoning with visible steps | `mix run --no-start examples/chain_of_thought.exs` |
| `qa_with_context.exs` | Context-aware Q&A | `mix run --no-start examples/qa_with_context.exs` |
| `multi_hop_qa.exs` | Multi-hop question answering | `mix run --no-start examples/multi_hop_qa.exs` |
| `rag.exs` | Retrieval-augmented generation | `mix run --no-start examples/rag.exs` |
| `custom_signature.exs` | Signatures with instructions | `mix run --no-start examples/custom_signature.exs` |
| `multi_field.exs` | Multiple inputs/outputs | `mix run --no-start examples/multi_field.exs` |
| `classification.exs` | Sentiment analysis | `mix run --no-start examples/classification.exs` |
| `entity_extraction.exs` | Extract people, orgs, locations | `mix run --no-start examples/entity_extraction.exs` |
| `code_gen.exs` | Code generation with reasoning | `mix run --no-start examples/code_gen.exs` |
| `math_reasoning.exs` | Complex math problem solving | `mix run --no-start examples/math_reasoning.exs` |
| `summarization.exs` | Text summarization | `mix run --no-start examples/summarization.exs` |
| `translation.exs` | Multi-language translation | `mix run --no-start examples/translation.exs` |
| `custom_module.exs` | Custom module composition | `mix run --no-start examples/custom_module.exs` |
| `optimization.exs` | BootstrapFewShot optimization | `mix run --no-start examples/optimization.exs` |
| `flagship_multi_pool_gepa.exs` | Multi-pool GEPA + numpy analytics pipeline | `mix run --no-start examples/flagship_multi_pool_gepa.exs` |
| `flagship_multi_pool_rlm.exs` | Multi-pool RLM + numpy analytics pipeline | `mix run --no-start examples/flagship_multi_pool_rlm.exs` |
| `rlm/rlm_data_extraction_experiment.exs` | RLM data extraction on NYC 311 (real dataset) | `mix run --no-start examples/rlm/rlm_data_extraction_experiment.exs` |
| `direct_lm_call.exs` | Direct LM interaction | `mix run --no-start examples/direct_lm_call.exs` |
| `timeout_test.exs` | Timeout configuration demo | `mix run --no-start examples/timeout_test.exs` |
Realistic RLM benchmark: the NYC 311 data extraction experiment uses 50,000 real records with exact, computable ground truth. On `gemini/gemini-flash-lite-latest`, an observed run scored RLM 100% vs. Direct 0%.
For flagship walkthroughs, see:

- `guides/flagship_multi_pool_gepa.md` (GEPA)
- `guides/flagship_multi_pool_rlm.md` (RLM)
## Timeout Configuration
DSPex leverages SnakeBridge's timeout architecture, designed for ML inference workloads. By default, all DSPy calls use the `:ml_inference` profile (10-minute timeout).
### Timeout Profiles
| Profile | Timeout | Use Case |
|---|---|---|
| `:default` | 2 min | Standard Python calls |
| `:streaming` | 30 min | Streaming responses |
| `:ml_inference` | 10 min | LLM inference (DSPex default) |
| `:batch_job` | 1 hour | Long-running batch operations |
### Per-Call Timeout Override
```elixir
# Use a different profile
DSPex.method!(predict, "forward", [],
  question: "Complex analysis...",
  __runtime__: [timeout_profile: :batch_job]
)

# Set an exact timeout in milliseconds
DSPex.method!(predict, "forward", [],
  question: "Quick question",
  __runtime__: [timeout: 30_000] # 30 seconds
)

# Helper functions
opts = DSPex.with_timeout([question: "test"], timeout: 60_000)
DSPex.method!(predict, "forward", [], opts)

# Profile helper
DSPex.method!(predict, "forward", [],
  Keyword.merge([question: "test"], DSPex.timeout_profile(:batch_job))
)
```

### Global Configuration
```elixir
# config/config.exs
config :snakebridge,
  runtime: [
    library_profiles: %{"dspy" => :ml_inference}
  ]
```

## API Reference
DSPex provides a thin wrapper over SnakeBridge's Universal FFI, while the generated `Dspy.*` modules expose DSPy 3.2.0's public API surface (produced via SnakeBridge's `module_mode: :explicit`):
### Lifecycle
| Function | Description |
|---|---|
| `DSPex.run/1,2` | Wrap code in Python lifecycle management |
### DSPy Helpers
| Function | Description |
|---|---|
| `DSPex.lm/1,2` | Create a DSPy language model |
| `DSPex.configure/0,1` | Configure DSPy global settings |
| `DSPex.predict/1,2` | Create a Predict module |
| `DSPex.chain_of_thought/1,2` | Create a ChainOfThought module |
### Universal FFI
| Function | Description |
|---|---|
| `DSPex.call/2-4` | Call any Python function or class |
| `DSPex.method/2-4` | Call a method on a Python object |
| `DSPex.attr/2` | Get an attribute from a Python object |
| `DSPex.set_attr/3` | Set an attribute on a Python object |
| `DSPex.get/2` | Get a module attribute |
| `DSPex.ref?/1` | Check whether a value is a Python object reference |
| `DSPex.bytes/1` | Encode binary data as Python bytes |
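As an illustration of reaching arbitrary Python from Elixir, the sketch below calls into Python's standard `math` module through the Universal FFI. The exact argument shapes for `DSPex.call` and `DSPex.get` are assumptions inferred from their arities; consult the DSPex docs for the authoritative signatures:

```elixir
DSPex.run(fn ->
  # Call an arbitrary Python function (assumed shape: module, function, args)
  {:ok, root} = DSPex.call("math", "sqrt", [144.0])

  # Read a module-level attribute (assumed shape: module, attribute)
  {:ok, pi} = DSPex.get("math", "pi")

  IO.inspect({root, pi})
end)
```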
### Timeout Helpers
| Function | Description |
|---|---|
| `DSPex.with_timeout/2` | Add timeout options to call opts |
| `DSPex.timeout_profile/1` | Get timeout profile opts |
| `DSPex.timeout_ms/1` | Get exact timeout opts |
All functions have `!` variants that raise on error instead of returning `{:error, reason}`.
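The two styles side by side (a sketch; the error reason is whatever the bridge returns):

```elixir
# Tuple variant: pattern-match on success or failure
case DSPex.lm("gemini/gemini-flash-lite-latest") do
  {:ok, lm} -> DSPex.configure!(lm: lm)
  {:error, reason} -> IO.inspect(reason, label: "LM creation failed")
end

# Bang variant: returns the value directly, raises on error
lm = DSPex.lm!("gemini/gemini-flash-lite-latest")
DSPex.configure!(lm: lm)
```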
## Architecture
The generated `Dspy.*` modules are thin wrappers over `SnakeBridge.Runtime`; they follow the same gRPC execution path as the `DSPex` convenience API.
```
┌─────────────────────────────────────────────────────────┐
│                    Your Elixir App                      │
├─────────────────────────────────────────────────────────┤
│                      DSPex.run/1                        │
│               (Python lifecycle wrapper)                │
├─────────────────────────────────────────────────────────┤
│                   SnakeBridge.call/4                    │
│                    (Universal FFI)                      │
├─────────────────────────────────────────────────────────┤
│                     Snakepit gRPC                       │
│                (Python process bridge)                  │
├─────────────────────────────────────────────────────────┤
│                      Python DSPy                        │
│             (Stanford NLP's LLM framework)              │
├─────────────────────────────────────────────────────────┤
│                     LLM Providers                       │
│        (OpenAI, Anthropic, Google, Ollama, etc.)        │
└─────────────────────────────────────────────────────────┘
```

**Key Design Principles:**
- **Dual API** — Generated `Dspy.*` bindings plus the minimal `DSPex` wrapper
- **Python-shaped docs** — DSPy modules appear as a familiar package tree
- **Automatic lifecycle** — Snakepit manages Python processes
- **Session-aware** — Maintains Python state across calls
- **Thread-safe** — gRPC bridge handles concurrency
## Requirements
- Elixir ~> 1.18
- Python 3.9+
- **API key** — Set `GEMINI_API_KEY`, `OPENAI_API_KEY`, `ANTHROPIC_API_KEY`, etc., depending on your provider
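For instance, a typical Gemini setup before running an example (a shell sketch; the key value is a placeholder):

```shell
# Export the provider key matching your model prefix
export GEMINI_API_KEY="your-key-here"

# Then run any example
mix run --no-start examples/basic.exs
```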
## Related Projects
- DSPy — The Python framework DSPex wraps
- SnakeBridge — The Python-Elixir bridge powering DSPex
- Snakepit — Python process pool and gRPC server
## License
MIT License. See LICENSE for details.