# LoomEx

An Elixir framework for building AI agents from OTP primitives.

**Status: Active Development.** APIs may change; contributions and feedback are welcome.

## What is LoomEx?

LoomEx weaves conversations, tool calls, and reasoning into coherent AI agents using Elixir's OTP building blocks (GenServer, Task.Supervisor, ETS). Instead of importing a heavyweight framework, you compose agents from simple behaviours and let the BEAM handle concurrency, fault tolerance, and streaming.

**Core idea:** define an agent (system prompt + tools + model), call `LoomEx.run/3`, and get streaming results with automatic multi-step tool execution.

```elixir
defmodule MyAgent do
  use LoomEx.Agent

  def system_prompt(_ctx), do: "You are a helpful coding assistant."
  def tools, do: [LoomEx.Tools.Bash, LoomEx.Tools.ReadFile, LoomEx.Tools.Grep]
  def model, do: "fireworks/accounts/fireworks/models/kimi-k2p5"
end

{:ok, result} =
  LoomEx.run(MyAgent, [LoomEx.Message.user("Find all TODO comments")],
    sink: fn
      {:text_delta, d} -> IO.write(d)
      _ -> :ok
    end
  )
```

## Features

- Streaming responses with automatic multi-step tool execution
- Agents and tools defined as plain Elixir behaviours
- Built on OTP: GenServer sessions, `Task.Supervisor` for parallel tool calls, ETS model cache
- Phoenix SSE streaming compatible with the Vercel AI SDK
- Built-in tools: bash, file read/write/edit, grep, glob, human-in-the-loop
- Telemetry events for structured observability
- Standalone CLI binary via Burrito

## Installation

Add LoomEx to your `mix.exs` dependencies:

```elixir
# From GitHub
{:loom_ex, github: "lulucatdev/loom_ex"}

# Or as a path dependency during development
{:loom_ex, path: "../loom_ex"}
```

Configure a provider in `config/runtime.exs`:

```elixir
config :loom_ex,
  providers: %{
    fireworks: %{
      api_key: System.get_env("FIREWORKS_API_KEY"),
      base_url: "https://api.fireworks.ai/inference/v1/chat/completions"
    },
    openrouter: %{
      api_key: System.get_env("OPENROUTER_API_KEY"),
      base_url: "https://openrouter.ai/api/v1/chat/completions"
    }
  }
```
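The leading path segment of the model string selects a provider from this map; the remainder is passed through as the provider-specific model id. A minimal sketch of that split (the `ModelStringSketch` module and `split_model/1` helper are illustrative, not part of LoomEx's API):

```elixir
defmodule ModelStringSketch do
  # Illustrative only: split "fireworks/accounts/fireworks/models/kimi-k2p5"
  # into a provider key (matching the config above) and a model id.
  def split_model(model) do
    [provider, rest] = String.split(model, "/", parts: 2)
    {String.to_atom(provider), rest}
  end
end
```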

## Quick Start

### Define an Agent

```elixir
defmodule MyApp.MathAgent do
  use LoomEx.Agent

  @impl true
  def system_prompt(_ctx), do: "You are a math tutor. Use the calculator when needed."

  @impl true
  def tools, do: [MyApp.Tools.Calculator]

  @impl true
  def model, do: "fireworks/accounts/fireworks/models/kimi-k2p5"
end
```

### Define a Tool

```elixir
defmodule MyApp.Tools.Calculator do
  use LoomEx.Tool

  @impl true
  def name, do: "calculator"

  @impl true
  def description, do: "Evaluate a math expression."

  @impl true
  def parameters do
    %{
      type: "object",
      properties: %{
        expr: %{type: "string", description: "Math expression, e.g. '6 * 7'"}
      },
      required: ["expr"]
    }
  end

  @impl true
  def execute(%{"expr" => expr}, _ctx) do
    # Caution: Code.eval_string/1 runs arbitrary code. Fine for a demo;
    # use a proper expression parser in production.
    {result, _bindings} = Code.eval_string(expr)
    {:ok, %{"result" => result}}
  end
end
```
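Because tools are plain modules, `execute/2` can be unit-tested directly. A standalone sketch with the same body (the `use LoomEx.Tool` line is omitted here only so the snippet runs without LoomEx installed):

```elixir
defmodule CalculatorSketch do
  # Same execute/2 as the Calculator tool above, minus `use LoomEx.Tool`,
  # so it can be exercised in isolation.
  def execute(%{"expr" => expr}, _ctx) do
    {result, _bindings} = Code.eval_string(expr)
    {:ok, %{"result" => result}}
  end
end
```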

### Run

```elixir
# Single execution with streaming
{:ok, result} =
  LoomEx.run(MyApp.MathAgent, [LoomEx.Message.user("What is 123 * 456?")],
    sink: fn
      {:text_delta, d} -> IO.write(d)
      {:tool_call_complete, tc} -> IO.puts("\n[Tool: #{tc.name}]")
      _ -> :ok
    end
  )

# result.messages — full conversation history
# result.steps    — number of LLM calls
# result.usage    — %{"prompt_tokens" => ..., "completion_tokens" => ...}
```

## Multi-turn Conversations

```elixir
{:ok, pid} = LoomEx.start_agent(MyApp.MathAgent)
{:ok, _} = LoomEx.call(pid, "What is 2 + 2?")
{:ok, _} = LoomEx.call(pid, "Now multiply that by 10")
messages = LoomEx.get_messages(pid)  # full history
```
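Under the hood, a multi-turn session is just process state on the BEAM. A hypothetical sketch of the state-keeping half (this `ConversationSketch` module is illustrative, not LoomEx's actual implementation):

```elixir
defmodule ConversationSketch do
  use GenServer

  # Illustrative only: an agent session as a GenServer whose state is the
  # accumulated message list, in the spirit of LoomEx.get_messages/1.
  def start_link(opts \\ []), do: GenServer.start_link(__MODULE__, opts)
  def add_message(pid, msg), do: GenServer.call(pid, {:add, msg})
  def get_messages(pid), do: GenServer.call(pid, :messages)

  @impl true
  def init(_opts), do: {:ok, []}

  @impl true
  def handle_call({:add, msg}, _from, messages), do: {:reply, :ok, messages ++ [msg]}
  def handle_call(:messages, _from, messages), do: {:reply, messages, messages}
end
```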

## Phoenix Controller

```elixir
defmodule MyAppWeb.ChatController do
  use MyAppWeb, :controller

  def chat(conn, %{"messages" => messages}) do
    {conn, _result} = LoomEx.Phoenix.Plug.stream_agent(conn, MyApp.ChatAgent, messages)
    conn
  end
end
```

The response streams as Server-Sent Events compatible with the Vercel AI SDK's `useChat` hook.
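For reference, each SSE message on the wire is a `data:` line, optionally preceded by an `event:` line, terminated by a blank line. A minimal framing sketch (illustrative, not LoomEx's actual encoder; the event name in the example is made up):

```elixir
defmodule SSEFrameSketch do
  # Illustrative SSE framing: the "event:" line is optional, and every
  # message ends with a blank line (double newline).
  def frame(data), do: "data: #{data}\n\n"
  def frame(event, data), do: "event: #{event}\ndata: #{data}\n\n"
end
```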

## Built-in Tools

| Tool | Module | Description |
|------|--------|-------------|
| `bash` | `LoomEx.Tools.Bash` | Execute shell commands with timeout and output truncation |
| `read_file` | `LoomEx.Tools.ReadFile` | Read files with line-numbered pagination |
| `write_file` | `LoomEx.Tools.WriteFile` | Write files, auto-creating directories |
| `edit_file` | `LoomEx.Tools.EditFile` | Exact string replacement with uniqueness check |
| `grep` | `LoomEx.Tools.Grep` | Search file contents with regex and glob filtering |
| `glob` | `LoomEx.Tools.Glob` | Find files by wildcard pattern |
| `human` | `LoomEx.Tools.Human` | Pause the agent, ask the user for input, continue |

Use them by listing them in your agent's `tools/0`:

```elixir
def tools, do: [LoomEx.Tools.Bash, LoomEx.Tools.ReadFile, LoomEx.Tools.Grep]
```

## CLI Binary

LoomEx can be packaged as a standalone CLI via Burrito:

```shell
# Build (requires Zig: brew install zig)
MIX_ENV=prod mix release

# Run
./burrito_out/loom_ex_macos_arm64 "What is 2+2?"
./burrito_out/loom_ex_macos_arm64 chat --tools bash,grep "Find all TODOs"
./burrito_out/loom_ex_macos_arm64 chat -i --model anthropic/claude-sonnet-4-6

# Pipe support
cat file.ex | ./loom_ex "summarize this code"
git diff | ./loom_ex "review this diff"
```

## Agent Callbacks

| Callback | Default | Description |
|----------|---------|-------------|
| `system_prompt(ctx)` | required | System prompt; receives a context map |
| `tools()` | required | List of tool modules |
| `model()` | required | Model string, e.g. `"fireworks/model-name"` |
| `max_steps()` | `10` | Maximum tool-call loops before stopping |
| `temperature()` | `0.1` | LLM temperature |
| `context_window()` | `128_000` | Fallback context window (auto-resolved from models.dev) |
| `extra_body()` | `%{}` | Extra fields merged into the LLM request body |
| `on_step(info, ctx)` | `:continue` | Called after each tool execution step |
| `on_error(error, ctx)` | `{:stop, error}` | Called on LLM errors |
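Overriding the optional callbacks is just defining the functions. A sketch of a tuned agent, written as a plain module so it stands alone (in a real agent you would keep the `use LoomEx.Agent` line from the earlier examples):

```elixir
defmodule TunedAgentSketch do
  # Required callbacks
  def system_prompt(_ctx), do: "You are a careful code reviewer."
  def tools, do: [LoomEx.Tools.Grep]
  def model, do: "fireworks/accounts/fireworks/models/kimi-k2p5"

  # Optional overrides (defaults: 10, 0.1, :continue, {:stop, error})
  def max_steps, do: 25
  def temperature, do: 0.0
  def on_step(_info, _ctx), do: :continue
  def on_error(error, _ctx), do: {:stop, error}
end
```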

## Architecture

```
LoomEx.run/3 or LoomEx.start_agent/2 + LoomEx.call/3
  |
  v
LoomEx.Agent.Runner (core loop)
  |
  +-- LoomEx.Context.maybe_compact()    auto-compress long conversations
  +-- LoomEx.LLM.Retry.chat_stream()    exponential backoff retry
  |     +-- LoomEx.LLM.Client           Req + SSEParser streaming
  |     +-- LoomEx.LLM.Provider         model string -> provider config
  |     +-- LoomEx.Models (ETS)         models.dev metadata cache
  +-- Tool execution                    Task.Supervisor (parallel)
  +-- LoomEx.Sink                       streaming output
  |     +-- Callback (fn)
  |     +-- Process (pid message)
  |     +-- LoomEx.Phoenix.SSESink      AI SDK v2 SSE protocol
  +-- LoomEx.Telemetry                  structured observability events
```
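The sink fan-out above means streaming output isn't tied to callbacks: a pid sink simply receives events as process messages. An illustrative sketch of that contract (event shapes are borrowed from the examples above; the exact message envelope LoomEx uses is an assumption):

```elixir
defmodule ProcessSinkSketch do
  # Illustrative only: forward one streamed event to a sink process
  # and read it back, mimicking the Process (pid message) sink.
  def roundtrip(event) do
    parent = self()

    sink =
      spawn(fn ->
        receive do
          ev -> send(parent, {:sunk, ev})
        end
      end)

    send(sink, event)

    receive do
      {:sunk, ev} -> ev
    after
      1_000 -> :timeout
    end
  end
end
```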

## Telemetry Events

| Event | Measurements | Metadata |
|-------|--------------|----------|
| `[:loom_ex, :agent, :start]` | - | `agent`, `context` |
| `[:loom_ex, :agent, :stop]` | `duration`, `steps` | `agent`, `result` |
| `[:loom_ex, :step, :start]` | - | `agent`, `step` |
| `[:loom_ex, :step, :stop]` | `duration` | `agent`, `step` |
| `[:loom_ex, :llm, :start]` | - | `model`, `message_count` |
| `[:loom_ex, :llm, :stop]` | `duration` | `model`, `finish_reason` |
| `[:loom_ex, :llm, :retry]` | `delay_ms` | `model`, `attempt`, `error` |
| `[:loom_ex, :tool, :start]` | - | `tool`, `tool_call_id` |
| `[:loom_ex, :tool, :stop]` | `duration` | `tool`, `tool_call_id` |
| `[:loom_ex, :tool, :error]` | `duration` | `tool`, `error` |
| `[:loom_ex, :context, :compact]` | `tokens_before`, `tokens_after` | `removed`, `kept` |

Attach the default logger for development:

```elixir
LoomEx.Telemetry.attach_default_logger()
```
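Beyond the default logger, you can attach your own handler with the standard `:telemetry.attach/4`. The module below is a pure-function sketch whose clause matches the retry event's shape from the table above:

```elixir
defmodule RetryHandlerSketch do
  # Shaped like a :telemetry handler; in a real app you would register it:
  #   :telemetry.attach("llm-retry", [:loom_ex, :llm, :retry],
  #                     &RetryHandlerSketch.handle_event/4, nil)
  def handle_event([:loom_ex, :llm, :retry], %{delay_ms: delay}, meta, _config) do
    "retrying #{meta.model} (attempt #{meta.attempt}) in #{delay}ms"
  end
end
```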

## Design Principles

## Inspirations

## License

MIT