# PhoenixAI
AI integration library for Elixir inspired by laravel/ai.
PhoenixAI provides a unified API for interacting with multiple AI providers, defining tools, composing sequential pipelines, and running parallel agents — all leveraging the BEAM/OTP concurrency model.
## Features
- Multi-provider support — OpenAI, Anthropic, and OpenRouter with a unified API
- Tool calling — Define tools as modules, automatic tool loop execution
- Streaming — Real-time token streaming with backpressure support
- Structured output — JSON schema validation for AI responses
- Agents — Stateful GenServer-based agents with conversation history
- Pipelines — Sequential step composition with context passing
- Teams — Parallel agent execution using `Task.async_stream`
- Telemetry — Built-in `:telemetry` spans for observability
- TestProvider — Offline testing with scripted responses
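The Teams feature builds on Elixir's standard-library `Task.async_stream`. A minimal sketch of that underlying concurrency pattern, using only the standard library (the agent calls are stubbed as plain functions, not PhoenixAI API):

```elixir
# Run several "agents" concurrently on the BEAM and collect results.
# This uses only Elixir's standard library; in PhoenixAI the function
# passed to Task.async_stream would be an agent invocation instead.
prompts = ["Summarize the report", "Translate the notice", "Classify the ticket"]

results =
  prompts
  |> Task.async_stream(
    fn prompt ->
      # Stub standing in for an agent call.
      "response to: " <> prompt
    end,
    max_concurrency: 4,
    timeout: 30_000
  )
  |> Enum.map(fn {:ok, result} -> result end)
```

`Task.async_stream/3` caps the number of concurrent tasks via `:max_concurrency` and lazily yields `{:ok, result}` tuples as tasks finish, which is what gives this pattern its backpressure.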
## Installation
Add `phoenix_ai` to your list of dependencies in `mix.exs`:
```elixir
def deps do
  [
    {:phoenix_ai, "~> 0.1.0"}
  ]
end
```

## Quick Start
```elixir
# Configure a provider (e.g. in config/config.exs)
config :phoenix_ai,
  provider: :openai,
  openai: [
    api_key: System.get_env("OPENAI_API_KEY"),
    model: "gpt-4o-mini"
  ]
```

```elixir
# Simple chat
{:ok, response} = AI.chat([
  %{role: "user", content: "Hello!"}
])

IO.puts(response.content)
```

## Documentation
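As an illustration of the tool-calling feature, tools are described above as modules that the library loops over automatically. The following is a hypothetical sketch only — the behaviour name `PhoenixAI.Tool`, the callbacks `name/0`, `description/0`, `parameters/0`, `run/1`, and the `:tools` option are assumptions for illustration, not confirmed API:

```elixir
# Hypothetical tool module. The behaviour and callback names here are
# assumptions chosen for illustration, not PhoenixAI's confirmed API.
defmodule MyApp.WeatherTool do
  @behaviour PhoenixAI.Tool

  def name, do: "get_weather"
  def description, do: "Look up the current weather for a city"

  # JSON schema for the tool's arguments.
  def parameters do
    %{
      type: "object",
      properties: %{city: %{type: "string"}},
      required: ["city"]
    }
  end

  def run(%{"city" => city}) do
    # Replace with a real weather API call.
    {:ok, "Sunny in #{city}"}
  end
end

# Hypothetical usage: pass the tool module alongside the messages.
{:ok, response} =
  AI.chat(
    [%{role: "user", content: "What's the weather in Oslo?"}],
    tools: [MyApp.WeatherTool]
  )
```

Consult the generated docs for the library's actual tool behaviour and options.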
## License
MIT License — see LICENSE for details.